On Wed, Sep 28, 2016 at 2:29 PM, Niketan Pansare <npan...@us.ibm.com> wrote:

> That is correct. Again, it is a good idea to make the Scala version
> explicit either in the jar naming or in the release notes.
>
> I am not sure what the recommended practice is in the Spark community for
> developing applications:
> - Should one release two jars with explicit Scala versions (Scala 2.10 and
> Scala 2.11) and let the user download the correct version, OR
> - Only release one jar (by sticking to a rule such as "the Scala version
> matches the default Scala version of the supported Spark version") and
> provide instructions for compiling with a different Scala version? Spark
> follows this option.
>
> For compiling with a different Scala version, isn't it as simple as
> providing a flag (-Dscala.version=2.11) to mvn rather than modifying the
> pom itself?
>
>
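On the cross-compile question: roughly yes, assuming the pom parameterizes
the Scala version as properties. A minimal sketch (the property names
scala.version and scala.binary.version are an assumption about how the pom
is set up; adjust to whatever the pom actually defines):

    # assumed pom property names; the binary version controls the _2.xx suffix
    mvn clean package -Dscala.version=2.11.8 -Dscala.binary.version=2.11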
There is usually one version of the jar in the distribution (e.g. the
default, _2.10), but during release both versions are published to Maven,
so when building applications you don't actually have to build the
framework yourself for the Scala version being used.
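For example, an application pom just declares the artifact whose suffix
matches its Scala version (using Spark 2.0.0 as an example version; both
spark-core_2.10 and spark-core_2.11 are published on Maven Central):

    <!-- pick the suffix that matches your application's Scala version -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.0.0</version>
    </dependency>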


-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/
