Oh, one more question: are you getting this exception on the front end or on the
worker? Can you provide the stack trace?

One strange thing is that the backend should never actually need the context; it
is purely a front-end thing. If you are getting this at the backend, it probably
means you are capturing some objects in a closure without realizing it --
including the context reference -- which should not happen. This is a very
common Spark programming problem.
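
A typical pattern, just to illustrate (the class and field names below are made
up, not taken from your app):

  import org.apache.spark.SparkContext

  class Driver(val sc: SparkContext) {
    val scale = 2.0

    // BAD: `scale` is a field, so the closure captures `this` -- and with it
    // the context reference -- and Spark tries to ship the whole object.
    def bad() = sc.parallelize(1 to 10).map(_ * scale).collect()

    // OK: copy the field to a local val first; only the Double is captured.
    def good() = {
      val s = scale
      sc.parallelize(1 to 10).map(_ * s).collect()
    }
  }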

If you are getting it on the front end, then it is a classpath problem in your
driver application and has nothing to do with Mahout itself. Make sure to
observe transitive dependency rules for the front end.
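
With sbt, one way that usually sorts this out is to build a fat jar so the driver
classpath contains Mahout and its transitive dependencies. A rough sketch below,
assuming the sbt-assembly plugin; the plugin and spark versions are only examples,
and Spark itself stays "provided" so it is not bundled twice. Alternatively, pass
the Mahout jars to spark-submit with --jars, which puts them on both the driver
and executor classpaths.

  // project/plugins.sbt  (plugin version is only an example)
  addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")

  // build.sbt -- Spark stays provided, Mahout gets bundled into the assembly
  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided"
  libraryDependencies += "org.apache.mahout" %% "mahout-math-scala" % "0.11.1"
  libraryDependencies += "org.apache.mahout" % "mahout-math" % "0.11.1"
  libraryDependencies += "org.apache.mahout" % "mahout-spark_2.10" % "0.11.1"

Then run sbt assembly and spark-submit the assembled jar.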

On Tue, Feb 2, 2016 at 12:53 PM, Dmitriy Lyubimov <dlie...@gmail.com> wrote:

> This is strange. If you took over the context and added the jars manually and it
> still does not work, then something is wrong with Spark, I guess, or with
> permissions, or one of those other 1000 things that can go wrong in a Linux/Spark
> deployment.
>
> You can try adding any custom jar to your application and calling it at the
> backend, to test whether jar shipping works at all.
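>
> For example, something along these lines (com.example.Probe and mytest.jar are
> just made-up names for whatever class lives only in your custom jar):
>
>   // ship the test jar the same way you ship the mahout jars
>   sc.addJar("/path/to/mytest.jar")
>
>   // if this also dies with NoClassDefFoundError on the executors, then jar
>   // shipping is broken in general, not just for the mahout jars
>   sc.parallelize(1 to 4).map(i => new com.example.Probe().ping(i)).collect()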
>
> I guess you can always drop the Mahout jars into the Spark classpath on the
> worker nodes, as the most desperate measure.
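>
> E.g. if the jars have already been copied onto every worker node, something like
> this should make the executors pick them up (the path below is only a
> placeholder, use wherever you actually dropped them):
>
>   val conf = new org.apache.spark.SparkConf()
>     .set("spark.executor.extraClassPath", "/opt/mahout/libs/*")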
>
> On Tue, Feb 2, 2016 at 9:10 AM, BahaaEddin AlAila <bahaelai...@gmail.com>
> wrote:
>
>> Thank you very much for your reply.
>> As I mentioned earlier, I am using mahoutSparkContext, and MAHOUT_HOME is
>> set to the correct mahout path.
>> I have also tried setting up the context myself, after looking into the
>> implementation of mahoutSparkContext, and supplied the jars path manually.
>> Still the same error.
>> I will try with Spark 1.5 and report back.
>>
>> Thank you very much again,
>>
>> Kind Regards,
>> Bahaa
>>
>>
>> On Tue, Feb 2, 2016 at 12:01 PM, Dmitriy Lyubimov <dlie...@gmail.com>
>> wrote:
>>
>> > Bahaa, first off, I don't think we have certified any of our releases to run
>> > with Spark 1.6 (yet). I think Spark 1.5 is the last release known to run
>> > with the 0.11 series.
>> >
>> > Second, if you use the mahoutSparkContext() method to create the context, it
>> > would look for the MAHOUT_HOME setup to add the Mahout binaries to the job. So
>> > the reason you may not be getting that is perhaps that you are not using
>> > mahoutSparkContext()?
>> >
>> > Alternatively, you can create the context yourself, but you need to (1) make
>> > sure it has Kryo serialization enabled and configured properly, and (2) add
>> > all the necessary Mahout jars on your own.
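>> >
>> > Roughly along these lines (a sketch from memory, not tested -- double-check the
>> > Kryo registrator class name and the context wrapper against the sparkbindings
>> > source, and substitute your actual jar paths):
>> >
>> >   import org.apache.spark.{SparkConf, SparkContext}
>> >   import org.apache.mahout.sparkbindings.SparkDistributedContext
>> >
>> >   val conf = new SparkConf()
>> >     .setAppName("app1")
>> >     .setMaster("spark://master:7077")                  // placeholder master URL
>> >     .set("spark.serializer",
>> >          "org.apache.spark.serializer.KryoSerializer")  // (1) Kryo enabled
>> >     .set("spark.kryo.registrator",
>> >          "org.apache.mahout.sparkbindings.io.MahoutKryoRegistrator")
>> >     .setJars(Seq(                                       // (2) mahout jars
>> >       "/home/you/mahout/mahout-math-0.11.1.jar",
>> >       "/home/you/mahout/mahout-math-scala_2.10-0.11.1.jar",
>> >       "/home/you/mahout/mahout-spark_2.10-0.11.1.jar"))
>> >
>> >   implicit val sdc = new SparkDistributedContext(new SparkContext(conf))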
>> >
>> > -d
>> >
>> > On Tue, Feb 2, 2016 at 8:22 AM, BahaaEddin AlAila <bahaelai...@gmail.com>
>> > wrote:
>> >
>> > > Greetings mahout users,
>> > >
>> > > I have been trying to use Mahout Samsara as a library with Scala/Spark, but
>> > > I haven't been successful in doing so.
>> > >
>> > > I am running the Spark 1.6.0 binaries; I didn't build it myself.
>> > > However, I tried both the readily available binaries on the Apache mirrors
>> > > and cloning and compiling Mahout's repo, but neither worked.
>> > >
>> > > I keep getting
>> > >
>> > > Exception in thread "main" java.lang.NoClassDefFoundError:
>> > > org/apache/mahout/sparkbindings/SparkDistributedContext
>> > >
>> > > The way I am doing things is:
>> > > I have spark in ~/spark-1.6
>> > > and mahout in ~/mahout
>> > > I have set both $SPARK_HOME and $MAHOUT_HOME accordingly, along with
>> > > $MAHOUT_LOCAL=true
>> > >
>> > > and I have:
>> > >
>> > > ~/app1/build.sbt
>> > > ~/app1/src/main/scala/App1.scala
>> > >
>> > > In build.sbt I have these lines to declare the Mahout dependencies:
>> > >
>> > > libraryDependencies += "org.apache.mahout" %% "mahout-math-scala" % "0.11.1"
>> > >
>> > > libraryDependencies += "org.apache.mahout" % "mahout-math" % "0.11.1"
>> > >
>> > > libraryDependencies += "org.apache.mahout" % "mahout-spark_2.10" % "0.11.1"
>> > >
>> > > along with other spark dependencies
>> > >
>> > > and in App1.scala, in the main function, I construct a context object using
>> > > mahoutSparkContext, and of course the sparkbindings are imported.
>> > >
>> > > Everything compiles successfully.
>> > >
>> > > However, when I submit to Spark, I get the above-mentioned error.
>> > >
>> > > I have a general idea of why this is happening: the compiled app1 jar
>> > > depends on the mahout-spark jar, but that jar cannot be found on the
>> > > classpath when the app is submitted to Spark.
>> > >
>> > > In the instructions I couldn't find how to explicitly add the mahout-spark
>> > > dependency jar to the classpath.
>> > >
>> > > The question is: Am I doing the configurations correctly or not?
>> > >
>> > > Sorry for the lengthy email
>> > >
>> > > Kind Regards,
>> > > Bahaa
>> > >
>> >
>>
>
>
