On Wed, Sep 28, 2016 at 2:29 PM, Niketan Pansare wrote:
> That is correct. Again, it is a good idea to make the scala version
> explicit either in jar naming or in the release notes.
>
> I am not sure what the recommended practice in the Spark community is for
> developing
On Wed, Sep 28, 2016 at 12:55 PM, Niketan Pansare
wrote:
> I think making the Scala version explicit is a good idea. Implicitly we are
> consistent with the Spark version supported in the release.
>
>
Implicitly only if the user does not choose to build the Spark release with
Scala 2.11.
> On Sep 28, 2016, at 12:40 PM, Luciano Resende wrote:
>
We are currently compiling some Scala code inside the SystemML jar, and we
are currently not identifying in any way which Scala version was used to
build the jar.
Should we start suffixing the jar with _2.10 and documenting how to build
with Scala 2.11?
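For reference, Spark itself encodes the Scala binary version in its Maven
artifactIds (e.g. spark-core_2.10 vs spark-core_2.11), usually via a build
property. A minimal sketch of that convention, assuming SystemML's Maven
build (the property name and artifactId below are illustrative, not the
actual pom):

```xml
<!-- Hypothetical pom.xml fragment: encode the Scala binary version
     in the artifactId so the published jar is suffixed, e.g.
     systemml_2.10-<version>.jar or systemml_2.11-<version>.jar. -->
<properties>
  <scala.binary.version>2.10</scala.binary.version>
</properties>

<artifactId>systemml_${scala.binary.version}</artifactId>
```

Switching the property (for example through a Maven profile) would then
produce a 2.11-suffixed artifact without further changes to the build.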
Also, during release, should we