Hi scrypso,
Sorry for the late reply. Yes, I did mean spark.driver.extraClassPath. I
was able to work around this issue by removing the need for an extra class,
but I'll investigate along these lines nonetheless.
Thanks again for all your help!
On Thu, Dec 15, 2022 at 9:56 PM scrypso wrote:
Hmm, did you mean spark.*driver*.extraClassPath? That is very odd then - if
you check the logs directory for the driver (on the cluster) I think there
should be a launch container log, where you can see the exact command used
to start the JVM (at the very end), and a line starting "export
Hi scrypso,
Thanks for the help so far, and I think you're definitely on to something
here. I tried loading the class as you suggested with the code below:
try {
    Thread.currentThread().getContextClassLoader().loadClass(MyS3ClientFactory.class.getCanonicalName());
    logger.info("Loaded class successfully");
} catch (ClassNotFoundException e) {
    logger.error("Could not load class", e);
}
I'm on my phone, so can't compare with the Spark source, but that looks to
me like it should be well after the ctx loader has been set. You could try
printing the classpath of the loader
Thread.currentThread().getContextClassLoader(), or try to load your
class from that yourself to see if
Thanks for the response, scrypso! I will try adding the extraClassPath
option. Meanwhile, please find the full stack trace below (I have
masked/removed references to proprietary code)
java.lang.RuntimeException: java.lang.RuntimeException:
java.lang.ClassNotFoundException: Class
Two ideas you could try:
You can try spark.driver.extraClassPath as well. Spark loads the user's jar
in a child classloader, so Spark/Yarn/Hadoop can only see your classes
reflectively. Hadoop's Configuration should use the thread ctx classloader,
and Spark should set that to the loader that
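
The parent/child visibility issue described above can be sketched with plain JDK classloaders (illustrative only; Spark's actual loader hierarchy differs, and the class and loader names here are made up for the example):

```java
// Illustrative sketch: framework code that resolves classes through the
// thread context classloader sees whatever that loader can see. Spark sets
// the context loader to the child loader holding the user jar; Hadoop's
// Configuration captures it for class lookups.
import java.net.URL;
import java.net.URLClassLoader;

public class CtxLoaderSketch {
    public static void main(String[] args) throws Exception {
        ClassLoader parent = CtxLoaderSketch.class.getClassLoader();
        // Stand-in for the child loader Spark creates for the user jar
        // (no URLs here, so it simply delegates to its parent).
        URLClassLoader child = new URLClassLoader(new URL[0], parent);

        Thread.currentThread().setContextClassLoader(child);

        // Code running later on this thread, resolving via the context
        // loader, goes through 'child' first.
        ClassLoader ctx = Thread.currentThread().getContextClassLoader();
        System.out.println(ctx == child); // prints true
    }
}
```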
I missed mentioning this above: the error is coming from the driver. I
tried using *--driver-class-path /path/to/my/jar* as well, but no luck.
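
For what it's worth, a quick way to check which loaders on the driver can actually see the class (a hypothetical diagnostic, not code from my app; the class name is the one from my jar listing):

```java
// Hypothetical diagnostic: probe which classloaders can resolve the
// custom factory class on the driver.
public class LoaderProbe {
    public static void main(String[] args) {
        String name = "aws.utils.MyS3ClientFactory"; // from the jar listing
        probe(name, ClassLoader.getSystemClassLoader(), "system");
        probe(name, Thread.currentThread().getContextClassLoader(), "context");
    }

    static boolean probe(String name, ClassLoader cl, String label) {
        try {
            // initialize=false: only check visibility, don't run static init
            Class.forName(name, false, cl);
            System.out.println(label + " loader: FOUND " + name);
            return true;
        } catch (ClassNotFoundException e) {
            System.out.println(label + " loader: NOT FOUND " + name);
            return false;
        }
    }
}
```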
Thanks!
On Mon, Dec 12, 2022 at 4:21 PM Hariharan wrote:
Hello folks,
I have a spark app with a custom implementation of
*fs.s3a.s3.client.factory.impl* which is packaged into the same jar.
Output of *jar tf*
*2620 Mon Dec 12 11:23:00 IST 2022 aws/utils/MyS3ClientFactory.class*
However, when I run my spark app with spark-submit in cluster mode, it