Thanks Michael,
That is one solution that I had thought of. It seems like a bit of
overkill for the few methods I want to do this for - but I will think
about it. I guess I was hoping that I was missing something more
obvious/easier.
Philip
On 07/21/2014 11:20 AM, Michael Malak wrote:
heya,
Without a bit of gymnastics at the type level, no. RDD doesn't actually
share any interfaces with the Scala collections library (the most likely
reason I can see is that Spark's operations are lazy, while the default
implementations in Scala aren't).
However, it would be possible by implementing an implicit converter.
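The implicit-converter idea could be sketched roughly like this. The names `Mappable`, `mapOver`, and `seqMappable` are hypothetical, and the RDD side is only indicated in a comment since Spark isn't on the classpath here:

```scala
import scala.language.implicitConversions

// Hypothetical common interface that generic code can target.
trait Mappable[A] {
  def mapOver[B](f: A => B): Seq[B]
}

// Converter for local collections. A real version would pair this with one
// for RDDs, e.g. something like:
//   implicit def rddMappable[A](rdd: RDD[A]): Mappable[A] = ...
implicit def seqMappable[A](xs: Seq[A]): Mappable[A] = new Mappable[A] {
  def mapOver[B](f: A => B): Seq[B] = xs.map(f)
}

// Generic code written once against the common interface; the implicit
// conversion kicks in at the call site for either backing type.
def incrementAll(xs: Mappable[Int]): Seq[Int] = xs.mapOver(_ + 1)
```

With both converters in scope, `incrementAll(Seq(1, 2, 3))` and `incrementAll(myRdd)` would both compile against the same function.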
It's really more of a Scala question than a Spark question, but the standard OO
(not Scala-specific) way is to create your own custom supertype (e.g.
MyCollectionTrait), inherited/implemented by two concrete classes (e.g. MyRDD
and MyArray), each of which manually forwards method calls to the contained
RDD or Array.
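A minimal sketch of that forwarding pattern, using plain Scala so it runs without Spark on the classpath. The names follow Michael's suggestion; the `MyRDD` class here wraps a `Seq` as a stand-in, where real code would wrap an `org.apache.spark.rdd.RDD[A]` and forward to it:

```scala
// Custom supertype exposing the operations shared by both backends.
trait MyCollectionTrait[A] {
  def map[B](f: A => B): MyCollectionTrait[B]
  def collect(): Seq[A] // materialize, analogous to RDD.collect()
}

// Concrete wrapper around a local collection: forwards map to Seq.map.
class MyArray[A](underlying: Seq[A]) extends MyCollectionTrait[A] {
  def map[B](f: A => B): MyCollectionTrait[B] = new MyArray(underlying.map(f))
  def collect(): Seq[A] = underlying
}

// Stand-in for the RDD wrapper: same shape, would forward to rdd.map(f).
class MyRDD[A](underlying: Seq[A]) extends MyCollectionTrait[A] {
  def map[B](f: A => B): MyCollectionTrait[B] = new MyRDD(underlying.map(f))
  def collect(): Seq[A] = underlying
}

// Generic code written once, usable with either backend.
def doubleAll(xs: MyCollectionTrait[Int]): MyCollectionTrait[Int] =
  xs.map(_ * 2)
```

The cost, as noted, is that every shared method has to be forwarded by hand in each wrapper, which is why it can feel like overkill for just a few methods.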
It is really nice that Spark RDDs provide functions that are often
equivalent to functions found in the Scala collections. For example, I can
call:
myArray.map(myFx)
and equivalently
myRdd.map(myFx)
Awesome!
My question is this. Is it possible to write code that works on either
an RDD or a local Scala collection?