Hey all,

I'd like to use the Scalaz library in some of my Spark jobs, but I'm running
into issues because some of the instances I use from Scalaz are not serializable. For
instance, in Scalaz there is a trait

/** In Scalaz */
trait Applicative[F[_]] {
  def apply2[A, B, C](fa: F[A], fb: F[B])(f: (A, B) => C): F[C]
  def point[A](a: => A): F[A]
}

But when I try to use it in, say, an `RDD#aggregate` call, I get:


Caused by: java.io.NotSerializableException:
scalaz.std.OptionInstances$$anon$1
Serialization stack:
        - object not serializable (class: scalaz.std.OptionInstances$$anon$1,
value: scalaz.std.OptionInstances$$anon$1@4516ee8c)
        - field (class: dielectric.syntax.RDDOps$$anonfun$1, name: G$1, type:
interface scalaz.Applicative)
        - object (class dielectric.syntax.RDDOps$$anonfun$1, <function2>)
        - field (class: dielectric.syntax.RDDOps$$anonfun$traverse$extension$1,
name: apConcat$1, type: interface scala.Function2)
        - object (class dielectric.syntax.RDDOps$$anonfun$traverse$extension$1,
<function2>)
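For completeness, the problem reduces to plain Java closure serialization - nothing Spark- or Scalaz-specific. The function I pass to `aggregate` captures the implicit `Applicative` instance as a field, and `ObjectOutputStream` rejects the whole object graph because that instance's class doesn't implement `Serializable`. A minimal repro without Spark or Scalaz (all names below are stand-ins I made up):

```scala
import java.io._

object ClosureRepro {
  // Stands in for scalaz.std.OptionInstances$$anon$1: a class that does
  // not implement java.io.Serializable.
  class NotSer

  /** Tries to Java-serialize `a`; returns the failure, if any. */
  def trySerialize(a: AnyRef): Option[NotSerializableException] =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(a)
      None
    } catch { case e: NotSerializableException => Some(e) }

  def main(args: Array[String]): Unit = {
    val instance = new NotSer
    // Captures `instance` as a field, just as the <function2> in the stack
    // trace captures G$1 (the Applicative instance).
    val capturing: (Int, Int) => Int = (a, b) => { instance.hashCode; a + b }
    // Captures nothing, so it serializes fine.
    val clean: (Int, Int) => Int = _ + _

    println(trySerialize(capturing)) // Some(java.io.NotSerializableException: ...)
    println(trySerialize(clean))     // None
  }
}
```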

Short of submitting a PR to Scalaz to make its instances Serializable, what
can I do on my end? I considered something like

implicit def applicativeSerializable[F[_]](implicit F: Applicative[F]):
SomeSerializableType[F] =
  new SomeSerializableType[F] { ... } ??

I'm not sure how to go about it - I looked at java.io.Externalizable, but
since `scalaz.Applicative` has no value members, it isn't clear how I would
implement that interface.
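To make the wrapper idea concrete, here is the kind of thing I was imagining (all names are mine, and it compiles against the two-method trait quoted at the top rather than the real `scalaz.Applicative`): never serialize the instance itself, only a serializable thunk that re-derives it, and rebuild it lazily after deserialization:

```scala
// Hypothetical sketch against the minimal Applicative trait quoted above.
trait Applicative[F[_]] {
  def apply2[A, B, C](fa: F[A], fb: F[B])(f: (A, B) => C): F[C]
  def point[A](a: => A): F[A]
}

// Stand-in for scalaz's (non-serializable) Option instance, kept in a
// stable object so a thunk can point back at it without capturing it.
object optionInstances {
  val optionApplicative: Applicative[Option] = new Applicative[Option] {
    def apply2[A, B, C](fa: Option[A], fb: Option[B])(f: (A, B) => C): Option[C] =
      fa.flatMap(a => fb.map(b => f(a, b)))
    def point[A](a: => A): Option[A] = Some(a)
  }
}

// Ships the thunk instead of the instance. The @transient lazy val is not
// written to the stream, so after deserialization the first use forces
// derive() again and rebuilds the instance on the remote side.
class SerializableApplicative[F[_]](derive: () => Applicative[F])
    extends Applicative[F] with java.io.Serializable {
  @transient private lazy val F: Applicative[F] = derive()
  def apply2[A, B, C](fa: F[A], fb: F[B])(f: (A, B) => C): F[C] =
    F.apply2(fa, fb)(f)
  def point[A](a: => A): F[A] = F.point(a)
}

// Usage: the thunk references the module, capturing no instance state.
// val ser = new SerializableApplicative[Option](() => optionInstances.optionApplicative)
```

The catch is that the thunk itself has to be serializable - a scalac-compiled lambda that closes over nothing should be, but I haven't checked that across Scala versions.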

Any guidance would be much appreciated - thanks!



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Getting-around-Serializability-issues-for-types-not-in-my-control-tp22193.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
