On Wed, Aug 10, 2016 at 12:04 AM, Kazuaki Ishizaki <ishiz...@jp.ibm.com> wrote:
> import testImplicits._
> test("test") {
>   val ds1 = sparkContext.parallelize(Seq(Array(1, 1), Array(2, 2),
>     Array(3, 3)), 1).toDS

You could just use Seq(...).toDS here.

>   val ds2 = ds1.map(e => e)

Why map(e => e)? It's the identity function and does nothing.

>     .as(RowEncoder(new StructType()
>       .add("value", ArrayType(IntegerType, false), nullable = false)))

I didn't know that was possible, but it looks similar to toDF, where you can replace the schema too (in a less involved way).

I learnt quite a lot from just a single email. Thanks!

Pozdrawiam,
Jacek

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org