[ 
https://issues.apache.org/jira/browse/SPARK-5770?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15367227#comment-15367227
 ] 

Namrata commented on SPARK-5770:
--------------------------------

I ran into the same issue while reloading a jar file using sc.addJar.
I am using a Job Server deployment with a long-running SparkContext.

Could we change the Executor ClassLoader implementation to maintain a 
ClassLoader per jar file? That would help with reloading modified classes: 
updating a jar could then be supported, though deleting a jar would still 
have no implementation.
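A minimal sketch of the classloader-per-jar idea suggested above. This is not Spark's actual Executor ClassLoader; the class and method names (PerJarClassLoaderRegistry, addOrReplaceJar) are hypothetical, purely to illustrate how replacing a jar's dedicated loader would let subsequent loadClass calls see the new bytecode:

```java
import java.io.IOException;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: one URLClassLoader per jar file. Re-adding a
// modified jar closes and replaces its loader, so later class loads
// are served from the new jar instead of a loader caching old bytes.
public class PerJarClassLoaderRegistry {
    private final ClassLoader parent;
    private final Map<String, URLClassLoader> loaders = new LinkedHashMap<>();

    public PerJarClassLoaderRegistry(ClassLoader parent) {
        this.parent = parent;
    }

    // Called whenever a jar is (re-)added, e.g. on sc.addJar.
    public synchronized void addOrReplaceJar(String jarPath) throws IOException {
        URLClassLoader old = loaders.remove(jarPath);
        if (old != null) {
            old.close();  // drop the stale loader for the previous jar version
        }
        loaders.put(jarPath,
            new URLClassLoader(new URL[] { new URL("file:" + jarPath) }, parent));
    }

    // Try each jar's loader in turn; a replaced jar's classes come from
    // its fresh loader rather than the one that loaded the old content.
    public synchronized Class<?> loadClass(String name) throws ClassNotFoundException {
        for (URLClassLoader cl : loaders.values()) {
            try {
                return cl.loadClass(name);
            } catch (ClassNotFoundException ignored) {
                // fall through to the next jar's loader
            }
        }
        throw new ClassNotFoundException(name);
    }
}
```

Closing the old URLClassLoader is what makes "update jar" work here; "delete jar" would additionally require evicting already-loaded classes, which the JVM does not allow for live instances, so it remains unsupported in this sketch as well.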

> Use addJar() to upload a new jar file to executor, it can't be added to 
> classloader
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-5770
>                 URL: https://issues.apache.org/jira/browse/SPARK-5770
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: meiyoula
>            Priority: Minor
>
> First use addJar() to upload a jar to the executor, then change the jar 
> content and upload it again. We can see that the local jar file has been 
> updated, but the classloader still loads the old one. The executor log 
> shows no error or exception pointing this out.
> I used spark-shell to test this, with "spark.files.overwrite" set to true.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
