Hello guys!

I am using the Spark shell, whose threads use a TranslatingClassLoader as the context class loader:

scala> Thread.currentThread().getContextClassLoader
res13: ClassLoader =
org.apache.spark.repl.SparkIMain$TranslatingClassLoader@23c767e6


For some reason I want to use another class loader, but when I do

val myclassloader = // create my own classloader
Thread.currentThread.setContextClassLoader(myclassloader)
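For completeness, here is a minimal, self-contained version of what I am trying. The URLClassLoader and the jar path below are only placeholders for illustration; my real loader is built differently:

import java.net.{URL, URLClassLoader}

// placeholder loader: a URLClassLoader over an example jar path,
// with the current loader as its parent
val myclassloader = new URLClassLoader(
  Array(new URL("file:///tmp/extra.jar")),
  getClass.getClassLoader)

// try to install it on the REPL thread
Thread.currentThread.setContextClassLoader(myclassloader)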

the call to setContextClassLoader doesn't seem to take effect. I still get the same
TranslatingClassLoader:

scala> Thread.currentThread().getContextClassLoader
res13: ClassLoader =
org.apache.spark.repl.SparkIMain$TranslatingClassLoader@23c767e6


In my previous Java projects I could change the context class loader without any problem.
Could someone explain why the approach above doesn't change the class loader in the
Spark shell, and whether there is any way to achieve it?

Thanks!
