GitHub user gatorsmile opened a pull request: https://github.com/apache/spark/pull/9314
[SPARK-11360] [Doc] Loss of nullability when writing parquet files

This fix adds one line to explain the current behavior of Spark SQL when writing Parquet files: all columns are forced to be nullable for compatibility reasons.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/gatorsmile/spark lossNull

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/9314.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #9314

----
commit 4a63fad3b432bcb16d0fa37555574c86112a2425
Author: gatorsmile <gatorsm...@gmail.com>
Date:   2015-10-28T01:33:04Z

    Document fix: loss of nullability when writing parquet files
----
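The behavior this doc change describes can be demonstrated with a short Parquet round-trip. The sketch below assumes a running Spark session bound to `spark` (a `SparkSession`, as in Spark 2.x+ shells; in the Spark 1.5-era of this PR the equivalent entry point was `sqlContext`); the path and column name are illustrative, not from the PR.

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{IntegerType, StructField, StructType}

// Build a DataFrame whose only column is explicitly NOT nullable.
val schema = StructType(Seq(StructField("id", IntegerType, nullable = false)))
val df = spark.createDataFrame(
  spark.sparkContext.parallelize(Seq(Row(1), Row(2))),
  schema)

df.schema("id").nullable   // false: the schema we declared

// Write to Parquet and read it back. Spark SQL forces every column
// to nullable on write, for compatibility reasons (SPARK-11360).
df.write.parquet("/tmp/nullability-demo")
val readBack = spark.read.parquet("/tmp/nullability-demo")

readBack.schema("id").nullable   // true: nullability was not preserved
```

In other words, `nullable = false` is a hint that Spark SQL does not enforce through the Parquet writer, which is exactly what the added documentation line warns about.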