Justin Foster created SPARK-14962:
-------------------------------------

                 Summary: spark.sql.orc.filterPushdown=true breaks DataFrame where functionality
                     Key: SPARK-14962
                     URL: https://issues.apache.org/jira/browse/SPARK-14962
                 Project: Spark
              Issue Type: Bug
              Components: SQL
        Affects Versions: 1.5.2
                Reporter: Justin Foster
When running spark-shell with the configuration "spark.sql.orc.filterPushdown=true", the DataFrame functions where and filter return incorrect results. In particular, filtering on "column is not null" fails after the data has been written to ORC and read back.

Example code:

import sqlContext.implicits._

case class MyData(string_field: String, array_field: Seq[String])

val myDataArray = Array(
  MyData("foo", Seq("bar")),
  MyData("foobar", null)
)

val myDataDF = sc.parallelize(myDataArray).toDF
myDataDF.count                                        // 2
myDataDF.where("array_field is null").count           // 1
myDataDF.where("array_field is not null").count       // 1

myDataDF.write.format("orc").save("/tmp/mydata.orc")

val myLoadedDataDF = sqlContext.read.format("orc").load("/tmp/mydata.orc")
myLoadedDataDF.count                                  // 2
myLoadedDataDF.where("array_field is not null").count // 0 -- incorrect, expected 1
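Since the misbehavior is tied to the pushdown setting named in the summary, a possible workaround sketch (an assumption based on that setting, not a confirmed fix) is to disable ORC predicate pushdown for the session before reading the data back:

```scala
// Workaround sketch: turn off the ORC filter pushdown named in this report.
// Assumes a Spark 1.5.x spark-shell session where `sqlContext` is in scope.
sqlContext.setConf("spark.sql.orc.filterPushdown", "false")

// Re-read the saved data; with pushdown disabled, the "is not null" filter
// should agree with the pre-save result above.
val reloadedDF = sqlContext.read.format("orc").load("/tmp/mydata.orc")
reloadedDF.where("array_field is not null").count
```

The same setting can also be passed at launch time, e.g. `spark-shell --conf spark.sql.orc.filterPushdown=false`, which avoids mutating the running session.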