Re: Status of 2.11 support?

2015-11-11 Thread Ted Yu
I started playing with Scala 2.12.0-M3, but the compilation didn't pass (as
expected).

Planning to get back to 2.12 once it is released.

FYI



Re: Status of 2.11 support?

2015-11-11 Thread Jakob Odersky
Hi Sukant,

Regarding the first point: when building Spark during my daily work, I
always use Scala 2.11 and have only run into build problems once. Assuming
a working build, I have never had any issues with the resulting artifacts.

More generally, however, I would advise you to go with Scala 2.11 in any
case. Scala 2.10 has reached end-of-life and, from what I gather from your
question, you have the opportunity to switch to a newer version, so why stay
on legacy? Furthermore, Scala 2.12 will be coming out early next year, so I
reckon that Spark will switch to Scala 2.11 as the default pretty soon*.

regards,
--Jakob

*I'm pretty new to the Spark community myself, so please don't take my word
on this as gospel.




Re: Status of 2.11 support?

2015-11-11 Thread Ted Yu
For #1, the published jars are usable.
However, you should build from source for your specific combination of
profiles.
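
For example, a minimal sbt build that pulls the published 2.11 artifacts from
Maven Central could look like the sketch below (the Spark and Scala versions
are only placeholders from around that time frame; use whatever matches your
setup):

    // build.sbt -- minimal sketch; versions are illustrative, not prescriptive
    scalaVersion := "2.11.7"

    libraryDependencies ++= Seq(
      // %% appends the Scala binary version, so these resolve the _2.11 artifacts
      "org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
      "org.apache.spark" %% "spark-sql"  % "1.5.2" % "provided"
    )

Building from source only becomes necessary when you need a profile
combination (Hadoop version, YARN, Hive, and so on) that the published
artifacts don't cover; the available profiles are described in the
"Building Spark" documentation.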

Cheers



Status of 2.11 support?

2015-11-11 Thread shajra-cogscale
Hi,

My company isn't using Spark in production yet, but we are using a bit of
Scala. A few people have wanted to be conservative and keep our Scala at
2.10 in the event we start using Spark. Others want to move to 2.11 on the
assumption that, by the time we're using Spark, it will be more or less
2.11-ready.

It's hard to make a strong judgement on these kinds of things without
getting some community feedback.

Looking through the internet, I saw:

1) There's advice to build 2.11 packages from source, but there are also
jars published to Maven Central for 2.11. Are those published jars usable,
and is the advice to build from source outdated?

2) There's a note that the JDBC RDD isn't 2.11-compliant. This is okay for
us, but is there anything else to worry about?
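
For context, "the JDBC RDD" presumably refers to org.apache.spark.rdd.JdbcRDD;
a rough, self-contained sketch of the kind of usage in question (the connection
URL, query, and table are made up purely for illustration):

    import java.sql.{DriverManager, ResultSet}

    import org.apache.spark.rdd.JdbcRDD
    import org.apache.spark.{SparkConf, SparkContext}

    object JdbcRddSketch {
      def main(args: Array[String]): Unit = {
        // local[*] only so the sketch runs standalone; not a deployment suggestion
        val conf = new SparkConf().setAppName("jdbc-rdd-sketch").setMaster("local[*]")
        val sc = new SparkContext(conf)

        // JdbcRDD expects exactly two '?' placeholders, filled with partition bounds
        val rows = new JdbcRDD(
          sc,
          () => DriverManager.getConnection("jdbc:h2:mem:example"),
          "SELECT id, name FROM people WHERE id >= ? AND id <= ?",
          lowerBound = 1L,
          upperBound = 1000L,
          numPartitions = 4,
          mapRow = (rs: ResultSet) => (rs.getInt(1), rs.getString(2))
        )

        println(rows.count())
        sc.stop()
      }
    }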

It would be nice to get some answers to those questions, as well as any
other feedback from maintainers or anyone who's used Spark with Scala 2.11
beyond simple examples.

Thanks,
Sukant



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Status-of-2-11-support-tp25362.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
