IIRC these all fit sbt’s conventions?
On Jul 7, 2017, at 2:05 PM, Trevor Grant wrote:
So to tie all of this together-
org.apache.mahout:mahout-spark_2.10:0.13.1_spark_1_6
org.apache.mahout:mahout-spark_2.10:0.13.1_spark_2_0
org.apache.mahout:mahout-spark_2.10:0.13.1_spark_2_1
org.apache.mahout:mahout-spark_2.11:0.13.1_spark_1_6
org.apache.mahout:mahout-spark_2.11:0.13.1_spark_2_0
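The scheme above can be sketched mechanically — a small Scala helper (hypothetical, just to illustrate the proposed coordinate layout: `<module>_<scalaBinary>:<version>_spark_<sparkVersion>`):

```scala
// Sketch of the proposed coordinate scheme; the helper name and signature
// are illustrative, not part of any Mahout API.
def coordinate(module: String, scalaBinary: String,
               version: String, spark: String): String =
  s"org.apache.mahout:${module}_$scalaBinary:${version}_spark_$spark"

val c = coordinate("mahout-spark", "2.11", "0.13.1", "2_0")
// yields "org.apache.mahout:mahout-spark_2.11:0.13.1_spark_2_0"
```

The Scala binary version rides on the artifactId (sbt's usual `_2.10`/`_2.11` suffix convention), while the Spark variant is folded into the version string.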
It would seem the 2nd option is preferable, if doable. Any option that has the most
desirable combinations prebuilt is preferable, I guess. Spark itself also
releases tons of Hadoop-profile binary variations, so I don't have to build
one myself.
On Fri, Jul 7, 2017 at 8:57 AM, Trevor Grant wrote:
Hey all,
Working on releasing 0.13.1 with multiple spark/scala combos.
Afaik, there is no 'standard' for multiple spark versions (but I may be
wrong, I don't claim expertise here).
One approach is to simply release binaries only for:
Spark-1.6 + Scala 2.10
Spark-2.1 + Scala 2.11
OR
We could do li
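Either way, from a consumer's side the dependency declaration stays simple. A minimal build.sbt sketch, assuming artifacts get published with the Spark variant encoded in the version string as discussed above (the exact version strings are hypothetical until a release is cut):

```scala
// build.sbt sketch -- sbt's %% operator appends the project's Scala binary
// suffix (_2.10 or _2.11) to the artifactId automatically, so the same line
// resolves the right cross-built artifact for the chosen scalaVersion.
scalaVersion := "2.11.8"

libraryDependencies += "org.apache.mahout" %% "mahout-spark" % "0.13.1_spark_2_1"
```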