GitHub user wangyum opened a pull request: https://github.com/apache/spark/pull/22124
[SPARK-25135][SQL] Insert datasource table may all null when select from view

## What changes were proposed in this pull request?

How to reproduce:

```scala
val path = "/tmp/spark/parquet"
val cnt = 30
spark.range(cnt).selectExpr("cast(id as bigint) as col1", "cast(id as bigint) as col2").write.mode("overwrite").parquet(path)
spark.sql(s"CREATE TABLE table1(col1 bigint, col2 bigint) using parquet location '$path'")
spark.sql("create view view1 as select col1, col2 from table1 where col1 > -20")
spark.sql("create table table2 (COL1 BIGINT, COL2 BIGINT) using parquet")
spark.sql("insert overwrite table table2 select COL1, COL2 from view1")
spark.table("table2").show
```

Every value in the result is `NULL`. This PR fixes that issue.

## How was this patch tested?

Unit tests.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/wangyum/spark SPARK-25135

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/22124.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #22124

----

commit 276879ca2bd8d2966b829b7e41e140362c4e4160
Author: Yuming Wang <yumwang@...>
Date: 2018-08-16T18:37:18Z

    insert datasource table may all null when select from view

----
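The symptom in the repro comes from a column-name case mismatch: `table2` declares `COL1`/`COL2` while the query output from `view1` carries `col1`/`col2`, and a case-sensitive name lookup finds no matching column, so every value reads back as `NULL`. The sketch below is a minimal, self-contained illustration of case-insensitive column resolution (it is not the actual Spark patch; `CaseInsensitiveResolve` and its method names are hypothetical):

```scala
// Hypothetical helper illustrating the failure mode and the remedy:
// a case-sensitive lookup of COL1 among {col1, col2} finds nothing,
// while a case-insensitive lookup resolves both columns.
object CaseInsensitiveResolve {
  // tableSchema: column names as declared on the insert target
  // queryOutput: column names produced by the SELECT
  // Returns, for each table column, the matching query column (if any).
  def resolveSensitive(tableSchema: Seq[String], queryOutput: Seq[String]): Seq[Option[String]] =
    tableSchema.map(t => queryOutput.find(q => q == t))

  def resolveInsensitive(tableSchema: Seq[String], queryOutput: Seq[String]): Seq[Option[String]] =
    tableSchema.map(t => queryOutput.find(q => q.equalsIgnoreCase(t)))

  def main(args: Array[String]): Unit = {
    val table = Seq("COL1", "COL2")
    val query = Seq("col1", "col2")
    // Case-sensitive: no column matches, analogous to the all-NULL result.
    println(resolveSensitive(table, query))
    // Case-insensitive: both columns resolve despite the case difference.
    println(resolveInsensitive(table, query))
  }
}
```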