Pat, Dmitriy,

Thanks for your feedback.
I managed to get Mahout's spark-shell running, but got a stack trace
(java.io.InvalidClassException) after running


val drmX = drmData(::, 0 until 4)

That's probably because I'm using Spark 1.0.2. I'll do what Dmitriy suggested
and let you know the result.
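Before rebuilding, I'll script a quick sanity check along these lines (the
version strings are just the ones from this thread, and the check itself is my
own sketch, not part of Mahout's scripts):

```shell
#!/bin/sh
# Sketch: fail fast when the Spark install does not match the version
# Mahout was built against. Values are hard-coded here for illustration.
built_against="1.0.1"   # version in Mahout's current head, per Dmitriy
running="1.0.2"         # the Spark release my SPARK_HOME points at
if [ "$built_against" != "$running" ]; then
  echo "version mismatch: Mahout built for Spark $built_against, SPARK_HOME has $running"
fi
```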
Thanks again.

Good day
Andrea
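P.S. For anyone else hitting the "Could not find or load main class" error in
the thread below: the root cause is a classpath glob in bin/mahout that
predates the Scala-version suffix Maven now puts in the jar names. A minimal
reproduction (the temporary directory is made up for illustration):

```shell
# Reproduce the glob mismatch behind bin/mahout's empty classpath entry.
mkdir -p /tmp/mahout-glob-demo
touch /tmp/mahout-glob-demo/mahout-spark-shell_2.10-1.0-SNAPSHOT.jar

# The old pattern expects a hyphen after "spark-shell" and matches nothing:
ls /tmp/mahout-glob-demo/mahout-spark-shell-*.jar 2>/dev/null | wc -l

# The fixed pattern allows the _2.10 suffix and finds the jar:
ls /tmp/mahout-glob-demo/mahout-spark-shell_*.jar 2>/dev/null | wc -l
```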




On Wed, Aug 13, 2014 at 7:27 PM, Dmitriy Lyubimov <dlie...@gmail.com> wrote:

> Emails 1 and 2 both seem to be classpath problems.
>
> Make sure Spark and Mahout are both compiled, that the Spark version
> corresponds to the one in Mahout (1.0.1 in the current head), and that
> SPARK_HOME and (I think) MAHOUT_HOME are set.
>
>
> On Wed, Aug 13, 2014 at 7:48 AM, Andrea Abelli <
> andrea.abe...@teralytics.ch>
> wrote:
>
> > Hello again
> >
> > I did some additional fiddling with ./bin/mahout:
> >
> > vagrant@vagrant-ubuntu-trusty-64:~/tl/mahout$ git diff
> > diff --git a/bin/mahout b/bin/mahout
> > index 5f54181..0174b31 100755
> > --- a/bin/mahout
> > +++ b/bin/mahout
> > @@ -161,7 +161,7 @@ then
> >    fi
> >
> >    # add scala dev target
> > -  for f in $MAHOUT_HOME/math-scala/target/mahout-math-scala-*.jar ; do
> > +  for f in $MAHOUT_HOME/math-scala/target/mahout-math-scala_*.jar ; do
> >       CLASSPATH=${CLASSPATH}:$f;
> >    done
> >
> > @@ -173,11 +173,11 @@ then
> >        CLASSPATH=${CLASSPATH}:$f;
> >      done
> >
> > -    for f in $MAHOUT_HOME/spark/target/mahout-spark-*.jar ; do
> > +    for f in $MAHOUT_HOME/spark/target/mahout-spark_*.jar ; do
> >        CLASSPATH=${CLASSPATH}:$f;
> >      done
> >
> > -    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ; do
> > +    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell_*.jar ; do
> >         CLASSPATH=${CLASSPATH}:$f;
> >      done
> >
> >
> > and got this error when running ./bin/mahout spark-shell:
> >
> > Exception in thread "main" java.lang.NoSuchMethodError:
> > org.apache.spark.HttpServer.<init>(Ljava/io/File;)V
> >   at org.apache.spark.repl.SparkIMain.<init>(SparkIMain.scala:100)
> >   at org.apache.spark.repl.SparkILoop$SparkILoopInterpreter.<init>(SparkILoop.scala:172)
> >   at org.apache.spark.repl.SparkILoop.createInterpreter(SparkILoop.scala:191)
> >   at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:883)
> >   at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
> >   at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
> >   at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
> >   at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:881)
> >   at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:973)
> >   at org.apache.mahout.sparkbindings.shell.Main$.main(Main.scala:31)
> >   at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)
> >
> > I then changed SPARK_HOME from 0.9.1 to spark-1.0.2, and now it seems to
> > work fine.
> >
> > Regards
> > Andrea
> >
> >
> >
> > On Wed, Aug 13, 2014 at 2:19 PM, Andrea Abelli <
> > andrea.abe...@teralytics.ch>
> > wrote:
> >
> > > Hi
> > >
> > > I hope you are well.
> > > While following this tutorial
> > > https://mahout.apache.org/users/sparkbindings/play-with-shell.html
> > > I ran into some problems.
> > > At step 4 of "Starting Mahout's Spark shell", executing `bin/mahout
> > > spark-shell` returns
> > > Error: Could not find or load main class
> > > org.apache.mahout.sparkbindings.shell.Main
> > > so I had a look at the classes folder's tree and ./bin/mahout's source code.
> > >
> > > vagrant@vagrant-ubuntu-trusty-64:~/tl/mahout$ ls -l $MAHOUT_HOME/spark-shell/target/
> > > total 40
> > > drwxrwxr-x 3 vagrant vagrant  4096 Aug 13 11:18 classes
> > > -rw-rw-r-- 1 vagrant vagrant     1 Aug 13 11:18 classes.timestamp
> > > -rw-rw-r-- 1 vagrant vagrant  3014 Aug 13 11:18 mahout-spark-shell_2.10-1.0-SNAPSHOT-sources.jar
> > > -rw-rw-r-- 1 vagrant vagrant  3132 Aug 13 11:18 mahout-spark-shell_2.10-1.0-SNAPSHOT-tests.jar
> > > -rw-rw-r-- 1 vagrant vagrant 14136 Aug 13 11:18 mahout-spark-shell_2.10-1.0-SNAPSHOT.jar
> > > drwxrwxr-x 2 vagrant vagrant  4096 Aug 13 11:18 maven-archiver
> > > drwxrwxr-x 2 vagrant vagrant  4096 Aug 13 11:18 test-classes
> > >
> > > while line 180 in ./bin/mahout reads
> > >     for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ; do
> > >
> > > Now, by applying the following diff
> > >
> > > diff --git a/bin/mahout b/bin/mahout
> > > index 5f54181..a6f4ba8 100755
> > > --- a/bin/mahout
> > > +++ b/bin/mahout
> > > @@ -177,7 +177,7 @@ then
> > >        CLASSPATH=${CLASSPATH}:$f;
> > >      done
> > >
> > > -    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ; do
> > > +    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell_*.jar ; do
> > >         CLASSPATH=${CLASSPATH}:$f;
> > >      done
> > >
> > > I'm now able to get to Mahout's shell after running `./bin/mahout
> > > spark-shell`, but I get the following errors:
> > >
> > > Using Scala version 2.10.3 (OpenJDK 64-Bit Server VM, Java 1.7.0_55)
> > > Type in expressions to have them evaluated.
> > > Type :help for more information.
> > > <console>:9: error: object drm is not a member of package
> > > org.apache.mahout.math
> > >                 @transient implicit val sdc:
> > > org.apache.mahout.math.drm.DistributedContext =
> > >                                                                     ^
> > > <console>:10: error: type SparkDistributedContext is not a member of
> > > package org.apache.mahout.sparkbindings
> > >                    new
> > > org.apache.mahout.sparkbindings.SparkDistributedContext(
> > >                                                        ^
> > > Mahout distributed context is available as "implicit val sdc".
> > > <console>:13: error: not found: value scalabindings
> > >        import scalabindings._
> > >               ^
> > > <console>:13: error: not found: value RLikeOps
> > >        import RLikeOps._
> > >               ^
> > > <console>:13: error: not found: value drm
> > >        import drm._
> > >               ^
> > > <console>:13: error: not found: value RLikeDrmOps
> > >        import RLikeDrmOps._
> > >               ^
> > >
> > > Does anyone have any idea of what's going wrong? Any hints on what I'm
> > > doing wrong or how I could fix this?
> > >
> > > Thanks in advance, and thanks for the awesome project.
> > > Looking forward to participating.
> > >
> > > Regards
> > > Andrea
> > >
> >
>



-- 

Andrea Abelli | TERALYTICS
*analytics scientist*

Teralytics AG | Zollstrasse 62 | 8005 Zurich | Switzerland
phone: +353 83 442 44 88
email: andrea.abe...@teralytics.ch
www.teralytics.net

Company registration number: CH-020.3.037.709-7 | Trade register Canton
Zurich
Board of directors: Georg Polzer, Mark Schmitz, Dr. Angelica Kohlmann Küpper
Data Privacy Supervisor: Prof. Dr. Donald Alan Kossmann

This e-mail message contains confidential information which is for the sole
attention and use of the intended recipient. Please notify us at once if
you think that it may not be intended for you and delete it immediately.
