Taoufik DACHRAOUI created SPARK-27457: -----------------------------------------
Summary: modify bean encoder to support avro objects
Key: SPARK-27457
URL: https://issues.apache.org/jira/browse/SPARK-27457
Project: Spark
Issue Type: New Feature
Components: SQL
Affects Versions: 2.4.1
Reporter: Taoufik DACHRAOUI

We modified JavaTypeInference so that it can create encoders for Avro objects. We now have two solutions: the one in this PR, and an alternative that uses Encoders.bean (which requires fewer code changes). Which one is better, and how should we proceed? Should we create another PR for the Encoders.bean solution?

The test we used for the Encoders.bean solution is as follows:

{code:java}
implicit val avroExampleEncoder =
  Encoders.bean[AvroExample1](classOf[AvroExample1])
    .asInstanceOf[ExpressionEncoder[AvroExample1]]

val input: AvroExample1 = AvroExample1.newBuilder()
  .setMyarray(List("Foo", "Bar").asJava)
  .setMyboolean(true)
  .setMybytes(java.nio.ByteBuffer.wrap("MyBytes".getBytes()))
  .setMydouble(2.5)
  .setMyfixed(new Magic("magic".getBytes))
  .setMyfloat(25.0F)
  .setMyint(100)
  .setMylong(10L)
  .setMystring("hello")
  .setMymap(Map(
    "foo" -> new java.lang.Integer(1),
    "bar" -> new java.lang.Integer(2)).asJava)
  .setMymoney(Money.newBuilder()
    .setAmount(100.0F)
    .setCurrency(Currency.EUR)
    .build())
  .build()

val row: InternalRow = avroExampleEncoder.toRow(input)
val output: AvroExample1 = avroExampleEncoder.resolveAndBind().fromRow(row)

val ds: Dataset[AvroExample1] = List(input).toDS()
println(ds.schema)
println(ds.collect().toList)

ds.write.format("avro").save("example1")
val fooDF = spark.read.format("avro").load("example1")
val fooDS = fooDF.as[AvroExample1]
println(fooDS.collect().toList)
{code}

The change is made in the branch [https://github.com/mazeboard/spark/tree/bean-encoder] *with 105 additions and 39 deletions (excluding the code for tests).*

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
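As background on why Encoders.bean is a plausible fit here: Avro-generated specific-record classes expose JavaBean-style getter/setter pairs, which is the shape that JavaTypeInference walks when deriving a schema for a bean encoder. A minimal sketch of that property-discovery step, using only the JDK's java.beans.Introspector; MoneyBean is a hypothetical stand-in for an Avro-generated class, not code from this PR:

```scala
import java.beans.Introspector

// Hypothetical stand-in for an Avro-generated class: like Avro's generated
// code, it exposes a getter/setter pair for each field.
class MoneyBean {
  private var amount: java.lang.Float = 0.0f
  def getAmount: java.lang.Float = amount
  def setAmount(a: java.lang.Float): Unit = { amount = a }
}

object BeanIntrospection {
  def main(args: Array[String]): Unit = {
    // A bean encoder can only round-trip properties that are both readable
    // and writable, so keep descriptors with a getter AND a setter
    // (this filters out read-only properties such as getClass).
    val props = Introspector.getBeanInfo(classOf[MoneyBean])
      .getPropertyDescriptors
      .filter(p => p.getReadMethod != null && p.getWriteMethod != null)
      .map(_.getName)
    println(props.mkString(","))  // prints "amount"
  }
}
```

This is only meant to illustrate why a generated Avro class can look like a JavaBean to the encoder machinery; the actual inference in Spark lives in JavaTypeInference.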