Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20851#discussion_r175317152
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFilters.scala ---
    @@ -50,6 +51,15 @@ private[parquet] object ParquetFilters {
           (n: String, v: Any) => FilterApi.eq(
             binaryColumn(n),
             Option(v).map(b => Binary.fromReusedByteArray(v.asInstanceOf[Array[Byte]])).orNull)
    +    case DateType if SQLConf.get.parquetFilterPushDownDate =>
    +      (n: String, v: Any) => {
    +        FilterApi.eq(
    +          intColumn(n),
    +          Option(v).map { date =>
    +            val days = date.asInstanceOf[java.sql.Date].getTime / (24 * 60 * 60 * 1000)
    +            days.toInt.asInstanceOf[Integer]
    --- End diff ---
    
    Use `DateTimeUtils.fromJavaDate` here and below?
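    
    For reference, a minimal sketch of what the suggestion could look like (illustrative only, not the actual patch; the helper name `dateToDays` is hypothetical, and `DateTimeUtils.fromJavaDate` is assumed to return the days-since-epoch `Int` that `intColumn` filters expect):
    
    ```scala
    import java.sql.Date
    import org.apache.spark.sql.catalyst.util.DateTimeUtils
    
    // Hypothetical helper: convert a pushed-down filter value to a boxed Integer
    // day count via DateTimeUtils.fromJavaDate, instead of dividing getTime by
    // milliseconds-per-day by hand.
    def dateToDays(v: Any): Integer =
      Option(v)
        .map(d => DateTimeUtils.fromJavaDate(d.asInstanceOf[Date]).asInstanceOf[Integer])
        .orNull
    ```
    
    Reusing the existing conversion would also keep the pushed-down value consistent with how Spark converts dates elsewhere.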


---
