Hi, All.

I made a PR in late March to upgrade Parquet to 1.9.0 for Apache Spark 2.2.

- https://github.com/apache/spark/pull/16281

Currently, there are some important options under discussion there. Here is a summary.

1. Fork Parquet 1.8.x and maintain it, as was done for Spark's Hive fork.
2. Wait and see how Parquet 1.9.x is adopted in other communities.
3. Build stronger integration tests covering both features and performance.

I thought it best to inform all of you on the dev mailing list because one of the options involves forking.
If you have any opinion, please reply here or on the PR.

BTW, the default decision is option 2, since we will keep using Apache Parquet 1.8.1 for a while.

Bests,
Dongjoon.
