Thanks for the suggestions @zivanfi. I can definitely make 1 happen. I believe 2
is already covered, because `SimpleRecordConverter` is used to convert nested
types through `SimpleMapRecordConverter` and `SimpleListRecordConverter`, which
both extend `SimpleRecordConverter`. Did a sanity check here:
```scala
// Run in spark-shell, where the implicit encoders needed by
// createDataset are already in scope.
case class Bar(x: Int, y: String)
case class Foo(x: Bar, y: Int)

org.apache.spark.sql.SparkSession.builder.getOrCreate
  .createDataset((0 to 1000).map(x => Foo(Bar(1, null), 23)))
  .write.parquet("/tmp/foobar/")
```
```
java -jar target/parquet-tools-1.10.1-SNAPSHOT.jar cat -json \
  /tmp/foobar/part-00000-35370411-c333-4eb9-9709-64b9d0abe657-c000.snappy.parquet \
  | head
{"x":{"x":1,"y":null},"y":23}
{"x":{"x":1,"y":null},"y":23}
{"x":{"x":1,"y":null},"y":23}
```
I will formalize this into a unit test.
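For anyone skimming the thread, the reason point 2 falls out for free is the delegation pattern: the child converters for nested types extend the base converter, so nested fields go through the same conversion logic as top-level ones. Below is a minimal, self-contained sketch of that pattern; the classes here are simplified stand-ins for illustration, not parquet-mr's actual converter code.

```java
// Simplified stand-ins illustrating the delegation pattern described above.
// NOT parquet-mr's actual SimpleRecordConverter hierarchy.
import java.util.ArrayList;
import java.util.List;

public class Sketch {
  /** Base converter: collects a record's fields as "name=value" strings. */
  static class SimpleRecordConverter {
    final List<String> fields = new ArrayList<>();

    void addPrimitive(String name, Object value) {
      fields.add(name + "=" + value);
    }

    /** Nested fields get a child converter that is itself a SimpleRecordConverter. */
    SimpleRecordConverter childFor(boolean isMap) {
      return isMap ? new SimpleMapRecordConverter() : new SimpleListRecordConverter();
    }
  }

  /** Handles map-typed fields; inherits primitive handling from the base. */
  static class SimpleMapRecordConverter extends SimpleRecordConverter { }

  /** Handles list-typed fields; inherits primitive handling from the base. */
  static class SimpleListRecordConverter extends SimpleRecordConverter { }

  public static void main(String[] args) {
    SimpleRecordConverter root = new SimpleRecordConverter();
    root.addPrimitive("y", 23);
    // The nested struct is converted by a child that extends the base class,
    // so behavior like null handling is implemented once, in one place.
    SimpleRecordConverter nested = root.childFor(false);
    nested.addPrimitive("x", 1);
    nested.addPrimitive("y", null);
    System.out.println(nested.fields);  // [x=1, y=null]
  }
}
```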
[ Full content available at: https://github.com/apache/parquet-mr/pull/518 ]