How did you recompile and deploy Spark to your cluster? It sounds like
a problem with the assembly not being deployed correctly, rather than
with your app.
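
One quick way to check which assembly the Partition class is actually coming
from is something like the following, pasted into a spark-shell running against
your cluster (just a sketch; it prints the jar on the driver and then on an
executor):

    // Rough sanity check: where is org.apache.spark.Partition loaded from?
    import org.apache.spark.Partition

    // Driver side
    println(classOf[Partition].getProtectionDomain.getCodeSource.getLocation)

    // Executor side (single dummy task) -- shows whether the workers are
    // picking up the rebuilt 1.2.0-SNAPSHOT assembly or a stale one
    sc.parallelize(Seq(1), 1)
      .map(_ => classOf[Partition].getProtectionDomain.getCodeSource.getLocation.toString)
      .collect()
      .foreach(println)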

On Tue, Oct 14, 2014 at 10:35 PM, Tamas Sandor <tsan...@gmail.com> wrote:
> Hi,
>
> I'm a rookie in Spark, but I hope someone can help me out. I'm writing an app
> that I'm submitting to my spark-master, which has a worker on a separate node.
> It uses the spark-cassandra-connector, which depends on Guava v16; since that
> conflicts with Guava v14.1 in the default spark-1.1.0 assembly, I built the
> latest from the Spark git master (this was fixed in late September), so I now
> have a working spark-assembly-1.2.0-SNAPSHOT-hadoop2.4.0 running.
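
(For reference, the dependency setup I'd expect in the app's build.sbt looks
roughly like the following; the connector and Hadoop versions are just guesses
at what you're using.)

    // Sketch: spark-core and hadoop-client stay "provided" so they are NOT
    // bundled into the uber-jar, while the Cassandra connector (which is what
    // pulls in Guava 16) is bundled.
    libraryDependencies ++= Seq(
      "org.apache.spark"   %% "spark-core"                % "1.2.0-SNAPSHOT" % "provided",
      "org.apache.hadoop"  %  "hadoop-client"             % "2.4.0"          % "provided",
      "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0-beta1"
    )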
>
> My uber-jar has hadoop-client and spark-assembly in provided scope, so they
> are excluded from the deployed jar, and then it gets submitted to the
> spark-master from the node. In the logs I see the TaskSetManager throwing an
> error from my worker node: "java.lang.NoClassDefFoundError:
> org/apache/spark/Partition". I guess that's expected, since my jar has no
> Spark deps inlined (it's an uber-jar), but why can't it see the worker's
> classpath? Isn't that exactly what "provided" scope should mean here?
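
(To your "provided" question: yes, with provided scope the Spark classes are
expected to come from the assembly already installed on each worker, and only
your application uber-jar is shipped to the executors. A rough sketch of how
I'd wire that up; the master URL and jar path are placeholders:)

    import org.apache.spark.{SparkConf, SparkContext}

    object CassandraApp {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("cassandra-app")                         // placeholder name
          .setMaster("spark://your-master:7077")               // placeholder master URL
          .setJars(Seq("target/scala-2.10/app-assembly.jar"))  // placeholder uber-jar path
        val sc = new SparkContext(conf)
        // ... job logic; org.apache.spark.Partition etc. resolve against the
        // worker-side assembly, not against this jar ...
        sc.stop()
      }
    }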
>
> How can I fix that? Am I missing something obvious?
> Thank you for your help.
>
> Regards,
> Tamas
