Re: Set is not parseable as row field in SparkSql

2015-01-29 Thread Jorge Lopez-Malla
Because there isn’t an obvious mapping from Set[T] to any SQL type. Currently we have complex types like array, map, and struct, which are inherited from Hive. In your case, I’d transform the Set[T] into a Seq[T] first; then Spark SQL can map it to an array. Cheng On 1/28/15 7:15 AM, Jorge Lopez
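A minimal sketch of the suggested workaround, assuming a hypothetical case class with a Set[(Int, Int)] field and the Spark 1.x createSchemaRDD / saveAsParquetFile API (class names and output path are illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical case classes: RawRecord holds a Set, ParquetRecord the Seq equivalent.
case class RawRecord(id: Int, pairs: Set[(Int, Int)])
case class ParquetRecord(id: Int, pairs: Seq[(Int, Int)])

object SetToSeqExample {
  def main(args: Array[String]): Unit = {
    val sc  = new SparkContext(new SparkConf().setAppName("set-to-seq").setMaster("local[*]"))
    val sqc = new SQLContext(sc)

    val raw = sc.parallelize(Seq(RawRecord(1, Set((1, 2), (3, 4)))))

    // Convert Set[T] to Seq[T] before building the SchemaRDD,
    // so Spark SQL can map the field to an array type.
    val converted = raw.map(r => ParquetRecord(r.id, r.pairs.toSeq))

    val schemaRDD = sqc.createSchemaRDD(converted)   // Seq maps to an array type
    schemaRDD.saveAsParquetFile("/tmp/parquet-out")  // hypothetical output path

    sc.stop()
  }
}
```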

Set is not parseable as row field in SparkSql

2015-01-28 Thread Jorge Lopez-Malla
Hello, we are trying to insert a case class into Parquet using Spark SQL. When I create the SchemaRDD, which includes a Set, I get the following exception: sqc.createSchemaRDD(r) scala.MatchError: Set[(scala.Int, scala.Int)] (of class scala.reflect.internal.Types$TypeRef$$anon$1) at
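For context, a small sketch that reproduces the reported error under the Spark 1.x API, assuming a hypothetical case class with a Set[(Int, Int)] field (names are illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Hypothetical case class with a Set field.
case class Record(id: Int, pairs: Set[(Int, Int)])

object SetFieldRepro {
  def main(args: Array[String]): Unit = {
    val sc  = new SparkContext(new SparkConf().setAppName("set-field-repro").setMaster("local[*]"))
    val sqc = new SQLContext(sc)

    val r = sc.parallelize(Seq(Record(1, Set((1, 2)))))

    // Schema inference has no mapping for Set[T], so this fails with
    // scala.MatchError: Set[(scala.Int, scala.Int)] ...
    sqc.createSchemaRDD(r)

    sc.stop()
  }
}
```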

Accumulables in transformation operations

2014-11-03 Thread Jorge Lopez-Malla
Hi, I'm reading the O'Reilly book Learning Spark and I have a doubt: does accumulator fault tolerance still only apply in action operations? Is this behaviour also expected if we use accumulables? Thanks in advance. Jorge López-Malla Matute Big Data Developer Avenida de Europa, 26.
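A small sketch of the distinction the question is about, using the Spark 1.x accumulator API (variable names are illustrative): updates made inside transformations can be applied more than once if a task is re-executed, while updates made inside actions are applied exactly once per task. Accumulable, which Accumulator extends, follows the same rules.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object AccumulatorSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("acc-sketch").setMaster("local[*]"))

    val inTransformation = sc.accumulator(0)
    val inAction         = sc.accumulator(0)

    val data = sc.parallelize(1 to 100)

    // Transformation: if a task is re-executed (failure, speculation,
    // or the RDD being recomputed), these updates may be applied more than once.
    val mapped = data.map { x =>
      inTransformation += 1
      x * 2
    }

    // Action: Spark guarantees each task's updates are applied exactly once.
    mapped.foreach(_ => inAction += 1)

    println(s"in transformation: ${inTransformation.value}, in action: ${inAction.value}")
    sc.stop()
  }
}
```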