OK guys, I've read the Spark code a bit more deeply on the serialization
side... You're right: Java and Kryo serialization are runtime-only, so this
isn't really a problem.

A few weeks ago, I studied how we could integrate Pickling into Spark, but
currently it's not really possible since Pickling is macro-based. In that
context, static types would be meaningful and covariance could have an
impact too.

So for now, I don't see anything against trying RDD[+T].
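For reference, the implicit-collision problem with contravariant typeclass
converters (the `Reader[-I, +O]` case quoted below, aka SI-2509) can be
sketched in a few lines. The names `Reader`, `describe` and the instances are
hypothetical, but the resolution behavior is what Scala 2 really does:

```scala
// Hypothetical sketch of the SI-2509 pitfall: with a contravariant
// typeclass parameter, Scala 2's implicit search considers the instance
// for the MOST GENERAL type to be the most specific one, because
// Reader[Any] <: Reader[String].
trait Reader[-I] { def read(i: I): String }

object Readers {
  implicit val anyReader: Reader[Any]       = (i: Any)    => "any: " + i
  implicit val stringReader: Reader[String] = (s: String) => "string: " + s
}

object Demo {
  import Readers._
  def describe[I](i: I)(implicit r: Reader[I]): String = r.read(i)

  // Resolved with both instances in scope: anyReader wins over
  // stringReader, the opposite of what you'd expect.
  val resolved: String = describe("hello")

  def main(args: Array[String]): Unit = println(resolved)
}
```

This is exactly why making converters contravariant on the input side gets
painful: the most general instance silently shadows the specific ones.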

Pascal

On Sat, Mar 22, 2014 at 9:01 PM, Michael Armbrust <mich...@databricks.com> wrote:

> Hi Pascal,
>
> Thanks for the input.  I think we are going to be okay here since, as Koert
> said, the current serializers use runtime type information.  We could also
> keep a ClassTag around for the original type when the RDD was created.
> Good things to be aware of though.
>
> Michael
>
> On Sat, Mar 22, 2014 at 12:42 PM, Pascal Voitot Dev <
> pascal.voitot....@gmail.com> wrote:
>
> > On Sat, Mar 22, 2014 at 8:38 PM, David Hall <d...@cs.berkeley.edu>
> wrote:
> >
> > > On Sat, Mar 22, 2014 at 8:59 AM, Pascal Voitot Dev <
> > > pascal.voitot....@gmail.com> wrote:
> > >
> > > > The problem I was talking about is when you try to use typeclass
> > > converters
> > > > and make them contravariant/covariant for input/output. Something
> like:
> > > >
> > > > trait Reader[-I, +O] { def read(i: I): O }
> > > >
> > > > Doing this, you soon have implicit collisions and philosophical
> > concerns
> > > > about what it means to serialize/deserialize a Parent class and a
> Child
> > > > class...
> > > >
> > >
> > >
> > > You should (almost) never make a typeclass param contravariant. It's
> > almost
> > > certainly not what you want:
> > >
> > > https://issues.scala-lang.org/browse/SI-2509
> > >
> > > -- David
> > >
> >
> > I confirm that it's a pain; I must say I never do it myself, but I've
> > inherited historical code that did it :)
> >
>
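Michael's idea above of keeping a ClassTag around for the original type could
look roughly like the following. This is a hypothetical sketch (`MyRDD` and
`widen` are made-up names, not Spark's actual API), just to show the tag
surviving covariant widening:

```scala
import scala.reflect.ClassTag

// Hypothetical sketch: capture the ClassTag of the original element type
// when the collection is created, so it is still available after the
// covariant type parameter has been widened to a supertype.
class MyRDD[+T](val elems: Seq[T], val elemTag: ClassTag[_]) {
  // Widening to a supertype keeps the tag captured at creation time.
  def widen[U >: T]: MyRDD[U] = new MyRDD[U](elems, elemTag)
}

object MyRDD {
  def apply[T: ClassTag](elems: T*): MyRDD[T] =
    new MyRDD(elems, implicitly[ClassTag[T]])
}
```

Even after `MyRDD(1, 2, 3).widen[Any]`, `elemTag` still reports the original
`Int`, which is the kind of information a runtime serializer would need.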
