Another option for artifact names (using jars as examples here):

mahout-spark-2.1_2.10-0.13.1.jar
mahout-spark-2.1_2.11-0.13.1.jar
mahout-math-scala-2.1_2.10-0.13.1.jar


i.e. <module>-<spark version>_<scala version>-<mahout version>.jar
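Under this scheme the Spark/Scala combination is baked into the artifact id itself, so a consumer would have to spell it out. A hypothetical sbt dependency (these exact coordinates are assumed for illustration, not published):

```scala
// Hypothetical sbt coordinates under this naming scheme.
// The Spark and Scala versions live inside the artifact id,
// so the plain "%" operator is used (no automatic Scala suffix).
libraryDependencies += "org.apache.mahout" % "mahout-spark-2.1_2.10" % "0.13.1"
```

One consequence: sbt's cross-building machinery (`%%`, `crossScalaVersions`) can't help here, since the Scala version is no longer in its conventional suffix position.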


Not exactly pretty... I somewhat prefer Trevor's idea of the DL4J convention.

________________________________
From: Trevor Grant <trevor.d.gr...@gmail.com>
Sent: Friday, July 7, 2017 11:57:53 AM
To: Mahout Dev List; u...@mahout.apache.org
Subject: [DISCUSS] Naming convention for multiple spark/scala combos

Hey all,

Working on releasing 0.13.1 with multiple spark/scala combos.

AFAIK, there is no 'standard' for supporting multiple Spark versions (but I may
be wrong; I don't claim expertise here).

One approach is to simply release binaries only for:
Spark-1.6 + Scala 2.10
Spark-2.1 + Scala 2.11

OR

Or we could follow the DL4J convention:

org.apache.mahout:mahout-spark_2.10:0.13.1_spark_1
org.apache.mahout:mahout-spark_2.11:0.13.1_spark_1

org.apache.mahout:mahout-spark_2.10:0.13.1_spark_2
org.apache.mahout:mahout-spark_2.11:0.13.1_spark_2
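For comparison, the dl4j-style coordinates keep the Scala suffix in its conventional position and carry the Spark line in the version string, so a consumer could use sbt's standard cross-version operator (coordinates again hypothetical):

```scala
// Hypothetical sbt dependency under the dl4j-style convention.
// "%%" appends the project's Scala binary version (_2.10 or _2.11)
// to the artifact id automatically; the Spark line is selected
// via the version string instead.
libraryDependencies += "org.apache.mahout" %% "mahout-spark" % "0.13.1_spark_2"
```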

OR

some other option I don't know of.
