Sorry, here's the whole code:

import scala.collection.mutable.ArrayBuffer

import org.apache.spark.sql.{Encoders, Row}

val source =
  spark.read.format("parquet").load("/emrdata/sources/very_large_ds")

// Kryo encoder for the (key, row buffer) pairs produced by the map below
implicit val mapEncoder =
  Encoders.kryo[(Any, ArrayBuffer[Row])]

source.map { row =>
  val key = row(0)
  val buff = new ArrayBuffer[Row]()
  buff += row
  (key, buff)
}

...
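For comparison, here is a minimal sketch of the same per-key grouping expressed with groupByKey, which only needs an encoder for the key, so Row itself never has to round-trip through Kryo. The assumption that the first column is a String key, and the trailing count(), are mine and not taken from the job above:

import org.apache.spark.sql.{Encoders, Row}

// Assumes an existing SparkSession named `spark`, as in the snippet above.
val source =
  spark.read.format("parquet").load("/emrdata/sources/very_large_ds")

// groupByKey only requires an encoder for the key type, so the rows stay
// in Spark's internal representation instead of going through Kryo.
val grouped = source.groupByKey((row: Row) => row.getString(0))(Encoders.STRING)

// Example downstream step: count the rows per key, just to force evaluation.
grouped.count().show(5)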

On Sun, Feb 26, 2017 at 7:31 AM, Stephen Fletcher <
stephen.fletc...@gmail.com> wrote:

> I'm attempting to map over a Dataset[Row], but I'm getting a decode error
> when I pass a custom encoder.
> My code looks similar to the following:
>
>
> val source = spark.read.format("parquet").load("/emrdata/sources/very_large_ds")
>
>
>
> source.map { row =>
>    val key = row(0)
>
> }
>
