Just realized that, of course, objects can't be generic, but how do I
create a generic AccumulatorParam?
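
One workaround that comes to mind (an untested sketch, assuming the Spark 1.x
SparkContext.accumulator API): only object definitions are barred from taking
type parameters, so a plain generic class extending AccumulatorParam[Seq[B]]
could be instantiated once per element type and passed to sc.accumulator
explicitly, something like:

    import org.apache.spark.{AccumulatorParam, SparkContext}

    // A generic class (rather than an object) can carry the type parameter.
    class SeqAccumulatorParam[B] extends AccumulatorParam[Seq[B]] {
        override def zero(initialValue: Seq[B]): Seq[B] = Seq.empty[B]
        override def addInPlace(s1: Seq[B], s2: Seq[B]): Seq[B] = s1 ++ s2
    }

    // Hypothetical usage: pass an instance explicitly for the element type.
    def collectInts(sc: SparkContext): Seq[Int] = {
        val acc = sc.accumulator(Seq.empty[Int])(new SeqAccumulatorParam[Int])
        sc.parallelize(1 to 10).foreach(i => acc += Seq(i))
        acc.value
    }

If that compiles, an implicit def (e.g. implicit def seqAccParam[B]:
AccumulatorParam[Seq[B]] = new SeqAccumulatorParam[B]) should let
sc.accumulator(Seq.empty[Int]) pick up the parameter automatically, so the
instance wouldn't need to be named at each call site.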

2014-10-01 12:33 GMT+02:00 Johan Stenberg <[email protected]>:

> Hi,
>
> I have a problem using accumulators in Spark. As shown on the Spark
> website, if you want custom accumulators you can simply extend the
> AccumulatorParam trait with an object. The problem is that I need that
> object to be generic, like this:
>
>     object SeqAccumulatorParam[B] extends AccumulatorParam[Seq[B]] {
>
>         override def zero(initialValue: Seq[B]): Seq[B] = Seq[B]()
>
>         override def addInPlace(s1: Seq[B], s2: Seq[B]): Seq[B] = s1 ++ s2
>
>     }
>
> But this gives a compile error, because objects can't take type
> parameters. My situation doesn't really allow me to define a separate
> SeqAccumulatorParam for each concrete type, since that would lead to a lot
> of ugly code duplication.
>
> I have an alternative approach: placing all of the results in an RDD and
> later iterating over them with an accumulator defined for that single
> type, but a generic accumulator would be much nicer.
>
> My question is: is there any other way to create such accumulators, or
> some magic for making generics and objects work together?
>
> Cheers,
>
> Johan
>