Hi

I ran into problems using the class loader in Spark. In my code (run within an
executor), I explicitly load classes using the context class loader, as below.

Thread.currentThread().getContextClassLoader()

The jar containing the classes to be loaded is added via the --jars option
in spark-shell/spark-submit.
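
Roughly, the executor-side pattern looks like the sketch below (pasted into a
spark-shell started with --jars extra.jar; the class name com.example.MyUdfImpl
and the jar name are placeholders for the real ones):

sc.parallelize(1 to 4).mapPartitions { iter =>
  // Runs on an executor; the class lives in the jar shipped via --jars.
  val loader = Thread.currentThread().getContextClassLoader
  val cls = Class.forName("com.example.MyUdfImpl", true, loader)
  val impl = cls.getDeclaredConstructor().newInstance()
  iter.map(_ => impl.getClass.getName)
}.collect()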

I always get a ClassNotFoundException. However, it works if I compile these
classes into the main jar for the job (the jar containing the main job class).

I know Spark implements its own class loaders in a particular way. Is there
a way to work around this? In other words, what is the proper way to
programmatically load classes from other jars added via --jars in Spark?
