Where is the exception thrown (full stack trace)? How are you running your
application, via spark-submit or spark-shell?
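
If it helps, a typical spark-submit invocation that ships an extra
dependency jar to the executors looks roughly like the sketch below (the
paths, class name, and master URL are placeholders, not taken from your
setup):

  spark-submit \
    --class com.mycompany.folder.Main \
    --master spark://your-master:7077 \
    --jars /path/to/extra-dep.jar \
    /path/to/your-app.jar

Jars passed via --jars should be copied to each executor and added to its
classpath, alongside the application jar itself.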

On Tue, Nov 3, 2015 at 1:43 AM, hveiga <kec...@gmail.com> wrote:

> Hello,
>
> I am facing an issue where I cannot run my Spark job in a cluster
> environment (standalone or EMR) but it works successfully if I run it
> locally using local[*] as master.
>
> I am getting ClassNotFoundException: com.mycompany.folder.MyObject on the
> slave executors. I don't really understand why this is happening, since I
> have unpacked the jar file to make sure the class is present inside (both
> the .java and the .class file), and all the rest of the classes are loaded
> fine.
>
> Also, I would like to mention something weird that might be related, but I
> am not sure: there are two packages inside my jar file with the same name
> but different casing:
>
> - com.mycompany.folder.MyObject
> - com.myCompany.something.Else
>
> Could that be the reason?
>
> Also, I have tried adding my jar files in all the ways I could find
> (sparkConf.setJars(...), sparkContext.addJar(...), the spark-submit --jars
> option, ...), but none of them actually worked.
>
> I am using Apache Spark 1.5.0, Java 7, sbt 0.13.7, and Scala 2.10.5.
>
> Thanks a lot,
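
For reference, the programmatic approaches mentioned above look roughly
like this; the jar paths and app name are placeholders:

  import org.apache.spark.{SparkConf, SparkContext}

  // Declare dependency jars up front, before the SparkContext is created
  val conf = new SparkConf()
    .setAppName("MyApp")
    .setJars(Seq("/path/to/extra-dep.jar"))
  val sc = new SparkContext(conf)

  // Or ship an additional jar after the context already exists
  sc.addJar("/path/to/other-dep.jar")

Both of these distribute jars to the executors; they do not change the
driver's own classpath, so a class needed on the driver still has to be
there at launch.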


--
Iulian Dragos

------
Reactive Apps on the JVM
www.typesafe.com
