A very timely article
http://rahulkavale.github.io/blog/2014/11/16/scrap-your-map-reduce/
Cheers
<k/>
P.S: Now reply to ALL.

On Sun, Nov 23, 2014 at 7:16 PM, Krishna Sankar <ksanka...@gmail.com> wrote:

> Good point.
> On the positive side, whether we choose the most efficient mechanism in
> Scala may not matter that much, since the Spark framework mediates the
> distributed computation. Even though parts of Spark are declarative, we can
> still choose an inefficient computation path that is not apparent to the
> framework, and which it therefore cannot optimize away.
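> A minimal sketch of the kind of thing I mean (assuming a word-count style
> job; the input path is made up, but groupByKey/reduceByKey are standard
> Spark operations on an RDD of pairs):
>
>     val pairs = sc.textFile("input.txt")      // 'sc' is the usual SparkContext
>       .flatMap(_.split("\\s+"))
>       .map(word => (word, 1))
>
>     // Inefficient path: ships every (word, 1) pair across the shuffle.
>     val countsSlow = pairs.groupByKey().mapValues(_.sum)
>
>     // Cheaper path: pre-combines counts on each node before the shuffle.
>     val countsFast = pairs.reduceByKey(_ + _)
>
> Both produce the same result, but Spark will not rewrite the first version
> into the second for us.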
> Cheers
> <k/>
> P.S: Now Reply to ALL
>
> On Sun, Nov 23, 2014 at 11:44 AM, Ognen Duzlevski <
> ognen.duzlev...@gmail.com> wrote:
>
>> On Sun, Nov 23, 2014 at 1:03 PM, Ashish Rangole <arang...@gmail.com>
>> wrote:
>>
>>> Java or Scala: I already knew Java, yet I learnt Scala when I came
>>> across Spark. As others have said, you can get started with a little bit of
>>> Scala and learn more as you progress. Once you have used Scala for a few
>>> weeks, you will want to stay with it rather than go back to Java. Scala is
>>> arguably more elegant and less verbose than Java, which translates into
>>> higher developer productivity and more maintainable code.
>>>
>>
>> Scala is arguably more elegant and less verbose than Java. However, Scala
>> is also a complex language with a lot of details, tidbits and one-offs that
>> you just have to remember. It is sometimes difficult to decide whether what
>> you wrote uses the language features most effectively, or whether you missed
>> an available feature that could have made the code better or more concise.
>> For Spark you really do not need to know that much Scala, but you do need to
>> understand the essence of it.
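>> A rough, hypothetical sketch of that essence ('sc' and the log path are
>> assumed): function literals, tuples and the '_' placeholder already cover
>> most day-to-day Spark code:
>>
>>     val lines  = sc.textFile("events.log")
>>     val errors = lines.filter(line => line.contains("ERROR"))  // function literal
>>     val byHost = errors.map(line => (line.split(" ")(0), 1))   // (key, value) tuple
>>     val counts = byHost.reduceByKey(_ + _)                     // '_' placeholder syntax
>>
>> Learn that much and you can read and write most Spark examples; the deeper
>> corners of the language can wait.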
>>
>> Thanks for the good discussion! :-)
>> Ognen
>>
>
>
