Thanks Trevor,
this encoding leaves the Scala version hard-coded, but it's an appreciated
clue and will get me going. There may be a way to use %% with this, or I can
just explicitly add the Scala version string.
@Hoa, I plan to update that repo.
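For reference, the two forms being discussed would look roughly like this in an sbt build definition (a sketch only; the %% variant assumes the artifact is published without hard-coding the suffix, which should be verified against the repository):

```scala
// Explicit Scala version suffix in the artifact name:
libraryDependencies += "org.apache.mahout" % "mahout-spark_2.11" %
  "0.13.1-SNAPSHOT" classifier "spark_2.1"

// Or let sbt append the suffix from scalaVersion via %%,
// assuming scalaVersion := "2.11.x" is set in the build:
libraryDependencies += "org.apache.mahout" %% "mahout-spark" %
  "0.13.1-SNAPSHOT" classifier "spark_2.1"
```

The %% operator simply appends "_" plus the binary Scala version to the artifact name, so the two lines resolve to the same artifact when scalaVersion is 2.11.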
On Oct 3, 2017, at 1:26 PM, Trevor Grant wrote:
The Spark backend is included via a Maven classifier; the sbt line should be:

libraryDependencies += "org.apache.mahout" % "mahout-spark_2.11" % "0.13.1-SNAPSHOT" classifier "spark_2.1"
On Tue, Oct 3, 2017 at 2:55 PM, Pat Ferrel wrote:
> I’m the aforementioned pferrel
>
> @Hoa,
Actually, if you require Scala 2.11 and Spark 2.1 you have to use the current
master (0.13.0 does not support these), and you also can't use sbt, unless you
have some trick I haven't discovered.
On Oct 3, 2017, at 12:55 PM, Pat Ferrel wrote:
I’m the aforementioned pferrel
@Hoa, thanks for that reference, I forgot I had that example. First, don't use
the Hadoop part of Mahout; it is not supported and will be deprecated. The
Spark version of cooccurrence will be supported. You'll find it in the
SimilarityAnalysis object.
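A rough sketch of what calling the Spark cooccurrence code looks like (method and parameter names here are from memory of the Mahout 0.13-era API and should be checked against the current source; `interactions` is a hypothetical placeholder for your loaded data):

```scala
import org.apache.mahout.math.cf.SimilarityAnalysis
import org.apache.mahout.math.indexeddataset.IndexedDataset

// Assumes `interactions` is an IndexedDataset of (user, item) rows,
// e.g. built with IndexedDatasetSpark from an RDD of interaction pairs.
val interactions: IndexedDataset = ???

// cooccurrencesIDSs takes one or more interaction datasets (the first is
// the primary action) and returns one item-item similarity IndexedDataset
// per input, ranked by log-likelihood ratio.
val similarityMatrices = SimilarityAnalysis.cooccurrencesIDSs(
  Array(interactions), // primary (and optional secondary) actions
  1234,                // random seed for downsampling
  50,                  // max similar items kept per item
  500                  // max interactions per user/item before downsampling
)
```

The LLR ranking is what makes this usable for recommenders: it keeps only anomalously co-occurring items rather than merely popular ones.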
If you go back
Code pointer:
https://github.com/rawkintrevo/cylons/tree/master/eigenfaces
However, I build Mahout (0.13.1-SNAPSHOT) locally with
mvn clean install -Pscala-2.11,spark-2.1,viennacl-omp -DskipTests
That's how Maven was able to pick those up.
On Fri, Sep 22, 2017 at 10:06 PM, Hoa Nguyen
Hey- sorry for long delay. I've been traveling.
Pat Ferrel was telling me he was having some similar issues with
Spark+Mahout+SBT recently, and that we need to re-examine our naming
conventions on JARs.
Fwiw, I have several projects that use Spark+Mahout on Spark 2.1/Scala 2.11,
and we even test
Hey all,
Thanks for the offers of help. I've been able to narrow down some of the
problems to version incompatibility and I just wanted to give an update.
Just to backtrack a bit, my initial goal was to run Mahout on a
distributed cluster, whether that was running Hadoop MapReduce or Spark.
I
Hi Hoa,
A few things could be happening here, I haven't run across that specific
error.
1) Spark 2.x + Mahout 0.13.0: Mahout 0.13.0 WILL run on Spark 2.x, but
you need to build it from source (not the binaries). You can do this by
downloading the Mahout source or cloning the repo and building
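Putting the steps together, the source build might look like this (a sketch; the profile names match the build command quoted earlier in the thread, but check the repo's README for the profiles available on your branch):

```shell
# Clone the Mahout source and build with Scala 2.11 / Spark 2.1 profiles.
# viennacl-omp is optional (OpenMP-accelerated native solvers).
git clone https://github.com/apache/mahout.git
cd mahout
mvn clean install -Pscala-2.11,spark-2.1,viennacl-omp -DskipTests
```

After `mvn install`, the 0.13.1-SNAPSHOT artifacts land in your local ~/.m2 repository, which is how a Maven (or sbt, with the classifier trick above in the thread) build can resolve them.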