Re: spark classloader question

2016-07-07 Thread Chen Song
Thanks Prajwal. I tried these options and they make no difference.

Re: spark classloader question

2016-07-07 Thread Chen Song
Thanks Marco. The code snippet has something like the below:

    ClassLoader cl = Thread.currentThread().getContextClassLoader();
    String packagePath = "com.xxx.xxx";
    final Enumeration<URL> resources = cl.getResources(packagePath);

The resources collection is always empty, indicating no classes are loaded.
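One likely cause worth checking (my reading, not confirmed in the thread): `ClassLoader.getResources()` takes a slash-separated resource path, not a dotted package name, so `"com.xxx.xxx"` will never match anything. A minimal sketch using a JDK class file as the probe target:

```java
import java.net.URL;
import java.util.Enumeration;

public class ResourceProbe {
    // ClassLoader.getResources() resolves resource paths, so the package
    // separator must be '/' rather than '.'.
    static boolean hasResource(String path) throws Exception {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        Enumeration<URL> resources = cl.getResources(path);
        return resources.hasMoreElements();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(hasResource("java.lang.String.class")); // dotted path: false
        System.out.println(hasResource("java/lang/String.class")); // slash path: true
    }
}
```

So the first thing to try is `packagePath.replace('.', '/')` before calling `getResources()`.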

Re: spark classloader question

2016-07-07 Thread Prajwal Tuladhar
You can try to play with the experimental flags [1] `spark.executor.userClassPathFirst` and `spark.driver.userClassPathFirst`. But these can also potentially break other things (e.g. dependencies that Spark itself requires getting overridden by your application's versions), so you will need to verify.
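For illustration, these flags are normally passed at submit time; a sketch of such a command, where the application class and jar names are placeholders:

```shell
# Hypothetical submit command; com.example.MyApp, extra-lib.jar, and
# app.jar are placeholders. With userClassPathFirst=true, the driver and
# executors prefer classes from the user's jars over Spark's own copies.
spark-submit \
  --class com.example.MyApp \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --jars extra-lib.jar \
  app.jar
```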

Re: spark classloader question

2016-07-07 Thread Marco Mistroni
Hi Chen,

Please post:
1. a code snippet
2. the exception

Any particular reason why you need to load classes in other jars programmatically? Have you tried building a fat jar with all the dependencies?

hth
marco
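For context, the fat-jar route means bundling the application and all of its dependencies into a single jar so no runtime classloading across jars is needed. A sketch, assuming one of the usual build plugins is configured:

```shell
# With sbt-assembly configured:
sbt assembly
# Or with the maven-shade-plugin bound to the package phase:
mvn package
# Then submit the single bundled jar without --jars:
spark-submit --class com.example.MyApp app-assembly.jar
```

Here `com.example.MyApp` and `app-assembly.jar` are placeholder names.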

Re: spark classloader question

2016-07-07 Thread Chen Song
Sorry to spam people who are not interested. I would greatly appreciate it if anyone who is familiar with this could share some insights.

spark classloader question

2016-07-06 Thread Chen Song
Hi,

I ran into problems using the class loader in Spark. In my code (run within an executor), I explicitly load classes using the ContextClassLoader, as below:

    Thread.currentThread().getContextClassLoader()

The jar containing the classes to be loaded is added via the --jars option.
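For reference, explicit loading through the context classloader typically looks like the sketch below; the JDK class name used in `main` is just a stand-in for the classes shipped via `--jars` (on a Spark executor, the task thread's context classloader should see those jars):

```java
public class ContextLoad {
    // Load a class reflectively through the current thread's context
    // classloader rather than the classloader of this class.
    static Class<?> load(String name) throws ClassNotFoundException {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        return Class.forName(name, true, cl);
    }

    public static void main(String[] args) throws ClassNotFoundException {
        System.out.println(load("java.util.ArrayList").getName()); // prints java.util.ArrayList
    }
}
```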