GitHub user sureshthalamati opened a pull request: https://github.com/apache/spark/pull/15192
[SPARK-14536][SQL] Fix NPE when reading null values in array type columns from Postgres

## What changes were proposed in this pull request?

The JDBC read path fails with a NullPointerException when the source table contains null values in an array type column, because the null check for the array data type is missing. For a NULL cell, ResultSet.getArray() returns null, so invoking any method on the returned Array object throws. This PR adds a null-safe check on the value returned by ResultSet.getArray() before calling methods on it.

## How was this patch tested?

Updated the Postgres integration test suite to cover null values in array columns, and ran the Docker integration tests locally.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/sureshthalamati/spark jdbc_array_null_fix-SPARK-14536

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/15192.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #15192

----

commit 9eb40dbcdb0894e699a38e6dc4f44dc97408f63c
Author: sureshthalamati <suresh.thalam...@gmail.com>
Date:   2016-09-22T00:57:05Z

    fix to jdbc read to handle null values in array data type column

----
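The null-safe pattern the description refers to can be sketched as follows. This is an illustrative sketch, not the PR's actual diff: the helper name `nullSafeConvert` and the demo object are hypothetical, and the point is simply that the value returned by `ResultSet.getArray()` must be checked for null before any method is invoked on it.

```scala
// Hypothetical sketch of a null-safe JDBC array read (names are illustrative).
object NullSafeArrayRead {
  // Applies the converter only when the JDBC value is non-null; returns null
  // otherwise, avoiding the NPE that occurs when a method is called on the
  // null returned by ResultSet.getArray() for a NULL array cell.
  def nullSafeConvert[T, R >: Null](input: T, f: T => R): R =
    if (input == null) null else f(input)

  def main(args: Array[String]): Unit = {
    // Simulates a NULL array column: ResultSet.getArray() would return null here.
    val nullArray: java.sql.Array = null
    val converted = nullSafeConvert[java.sql.Array, Seq[AnyRef]](
      nullArray,
      a => a.getArray.asInstanceOf[Array[AnyRef]].toSeq)
    // Without the guard, calling a.getArray on null would throw an NPE.
    assert(converted == null)
    println(converted)
  }
}
```

With the guard in place, a NULL array cell simply maps to a null value in the row instead of crashing the read.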