Hi-

Yup, I’ve already done so here:
https://issues.apache.org/jira/browse/SPARK-7944

Please let me know if you need any more information; I'm more than happy
to provide whatever I can.

Thanks
Alex

On Sun, May 31, 2015 at 8:45 AM, Tathagata Das <[email protected]> wrote:

> Can you file a JIRA with the detailed steps to reproduce the problem?
>
> On Fri, May 29, 2015 at 2:59 AM, Alex Nakos <[email protected]> wrote:
>
>> Hi-
>>
>> I’ve just built the latest Spark RC from source (1.4.0 RC3) and can
>> confirm that the spark-shell is still NOT working properly on Scala 2.11.
>> No classes in the jar I’ve specified with the --jars argument on the
>> command line are available in the REPL.
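>>
>> For reference, the failure looks like this (the jar path and class name
>> are hypothetical placeholders, and the error line is illustrative):
>>
>>   $ bin/spark-shell --jars /path/to/my-lib_2.11.jar
>>   scala> import com.example.mylib.MyClass
>>   <console>:19: error: object example is not a member of package com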
>>
>>
>> Cheers
>> Alex
>>
>> On Thu, May 28, 2015 at 8:38 AM, Tathagata Das <[email protected]>
>> wrote:
>>
>>> Would be great if you guys can test out the Spark 1.4.0 RC2 (RC3 coming
>>> out soon) with Scala 2.11 and report issues.
>>>
>>> TD
>>>
>>> On Tue, May 26, 2015 at 9:15 AM, Koert Kuipers <[email protected]>
>>> wrote:
>>>
>>>> We are still running into issues with spark-shell not working on 2.11,
>>>> but we are running on a somewhat older master, so that may have been
>>>> resolved already.
>>>>
>>>> On Tue, May 26, 2015 at 11:48 AM, Dean Wampler <[email protected]>
>>>> wrote:
>>>>
>>>>> Most of the 2.11 issues are being resolved in Spark 1.4. For a while
>>>>> now, the Spark project has published Maven artifacts compiled for both
>>>>> 2.11 and 2.10, although the downloads at
>>>>> http://spark.apache.org/downloads.html are still all for 2.10.
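>>>>>
>>>>> In sbt that just means letting the Scala version pick the artifact
>>>>> suffix. A minimal sketch (the version numbers here are examples;
>>>>> use whatever you're actually on):
>>>>>
>>>>>   scalaVersion := "2.11.6"
>>>>>   libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
>>>>>
>>>>> The %% operator appends the Scala binary version to the artifact name,
>>>>> so this resolves spark-core_2.11 rather than spark-core_2.10.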
>>>>>
>>>>> Dean Wampler, Ph.D.
>>>>> Author: Programming Scala, 2nd Edition
>>>>> <http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
>>>>> Typesafe <http://typesafe.com>
>>>>> @deanwampler <http://twitter.com/deanwampler>
>>>>> http://polyglotprogramming.com
>>>>>
>>>>> On Tue, May 26, 2015 at 10:33 AM, Ritesh Kumar Singh <
>>>>> [email protected]> wrote:
>>>>>
>>>>>> Yes, the recommended version is 2.10, as not all features are
>>>>>> supported on 2.11 yet. The Kafka libraries and JDBC components have
>>>>>> not yet been ported to 2.11, so if your project doesn't depend on
>>>>>> those components, you can give 2.11 a try.
>>>>>>
>>>>>> Here's a link
>>>>>> <https://spark.apache.org/docs/1.2.0/building-spark.html#building-for-scala-211>
>>>>>> for building with Scala 2.11.
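>>>>>>
>>>>>> If I remember that page correctly, the build boils down to switching
>>>>>> the POMs over and enabling the scala-2.11 profile (the exact flags
>>>>>> may differ between releases, so double-check against the docs):
>>>>>>
>>>>>>   dev/change-version-to-2.11.sh
>>>>>>   mvn -Pscala-2.11 -DskipTests clean package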
>>>>>>
>>>>>> That said, you won't run into any issues if you stick with 2.10 for
>>>>>> now. Keep in mind, though, that future releases will have to shift to
>>>>>> 2.11 once support for 2.10 ends.
>>>>>>
>>>>>>
>>>>>> On Tue, May 26, 2015 at 8:21 PM, Punyashloka Biswal <
>>>>>> [email protected]> wrote:
>>>>>>
>>>>>>> Dear Spark developers and users,
>>>>>>>
>>>>>>> Am I correct in believing that the recommended version of Scala to
>>>>>>> use with Spark is currently 2.10? Is there any plan to switch to 2.11
>>>>>>> in the future? Are there any advantages to using 2.11 today?
>>>>>>>
>>>>>>> Regards,
>>>>>>> Punya
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
