Hi Sebastian,

That exception generally means the class has been loaded by two
different class loaders, and some code is trying to cast an instance
created by one loader's copy of the class to the other loader's copy.
The JVM treats those as two distinct classes even though they share
the same name, which is why the message looks like a class "cannot be
cast" to itself.
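
For illustration, here's a minimal, self-contained Scala sketch of
the mechanism (the jar path is a placeholder, and it assumes
DenseBlock has a no-arg constructor): two sibling loaders each define
their own copy of the class, and casting across them fails exactly
like in your log.

    import java.net.{URL, URLClassLoader}

    object TwoLoaders {
      def main(args: Array[String]): Unit = {
        // Hypothetical path; substitute the jar that contains the class.
        val jar = new URL("file:/tmp/app-uber.jar")

        // Two sibling loaders with no shared parent for app classes,
        // so each defines its own copy of DenseBlock.
        val loaderA = new URLClassLoader(Array(jar), null)
        val loaderB = new URLClassLoader(Array(jar), null)

        val classA = loaderA.loadClass("io.ssc.sampling.matrix.DenseBlock")
        val classB = loaderB.loadClass("io.ssc.sampling.matrix.DenseBlock")

        println(classA == classB)  // false: same name, distinct Class objects

        // Assumes a no-arg constructor, purely for illustration. The cast
        // below throws "DenseBlock cannot be cast to DenseBlock".
        val instance = classA.getDeclaredConstructor().newInstance()
        classB.cast(instance)
      }
    }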

Do you happen to have that class both in the Spark jars and in your
app's uber-jar? That could explain the problem, although I'm not
terribly familiar with Spark's class loader hierarchy.
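
If it helps narrow things down, you could check which loader owns the
class on the driver, e.g. from the shell (class name taken from your
log):

    scala> Class.forName("io.ssc.sampling.matrix.DenseBlock").getClassLoader

Comparing that against instance.getClass.getClassLoader on an object
that comes back from a task should show whether two copies are in play.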


On Thu, May 29, 2014 at 5:51 AM, Sebastian Schelter <s...@apache.org> wrote:
> Hi,
>
> I have trouble running some custom code on Spark 0.9.1 in standalone mode on
> a cluster. I built a fat jar (excluding Spark) that I'm adding to the
> classpath with ADD_JARS=... When I start the Spark shell, I can instantiate
> classes, but when I run Spark code, I get strange ClassCastExceptions like
> this:
>
> 14/05/29 14:48:10 INFO TaskSetManager: Loss was due to
> java.lang.ClassCastException: io.ssc.sampling.matrix.DenseBlock cannot be
> cast to io.ssc.sampling.matrix.DenseBlock [duplicate 1]
>
> What am I doing wrong?
>
> Thx,
> Sebastian



-- 
Marcelo
