GitHub user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19651#discussion_r150293329
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/orc/OrcFilters.scala ---
    @@ -0,0 +1,180 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.datasources.orc
    +
    +import org.apache.orc.storage.ql.io.sarg.{PredicateLeaf, SearchArgument, SearchArgumentFactory}
    +import org.apache.orc.storage.ql.io.sarg.SearchArgument.Builder
    +import org.apache.orc.storage.serde2.io.HiveDecimalWritable
    +
    +import org.apache.spark.sql.sources.Filter
    +import org.apache.spark.sql.types._
    +
    +/**
    + * Utility functions to convert Spark data source filters to ORC filters.
    + */
    +private[orc] object OrcFilters {
    +
    +  /**
    +   * Create an ORC filter as a SearchArgument instance.
    +   */
    +  def createFilter(schema: StructType, filters: Seq[Filter]): Option[SearchArgument] = {
    +    val dataTypeMap = schema.map(f => f.name -> f.dataType).toMap
    +
    +    // First, try to convert each filter individually to see whether it's convertible,
    +    // and then collect all convertible ones to build the final `SearchArgument`.
    +    val convertibleFilters = for {
    +      filter <- filters
    +      _ <- buildSearchArgument(dataTypeMap, filter, SearchArgumentFactory.newBuilder())
    +    } yield filter
    +
    +    for {
    +      conjunction <- convertibleFilters.reduceOption(org.apache.spark.sql.sources.And)
    +      builder <- buildSearchArgument(dataTypeMap, conjunction, SearchArgumentFactory.newBuilder())
    +    } yield builder.build()
    +  }
    +
    +  /**
    +   * Return true if this is a searchable type in ORC.
    +   */
    +  private def isSearchableType(dataType: DataType) = dataType match {
    +    case ByteType | ShortType | FloatType | DoubleType => true
    +    case IntegerType | LongType | StringType | BooleanType => true
    +    case TimestampType | _: DecimalType => true
    --- End diff --
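    
    For context, here is a minimal, hypothetical sketch of driving `createFilter`; the schema and filter values are made up for illustration, and it assumes code living inside the same package, since `OrcFilters` is `private[orc]`. Per the comment in the diff, each filter is first converted on its own, so a single inconvertible predicate does not invalidate the whole conjunction:
    
        import org.apache.spark.sql.sources.{EqualTo, Filter, GreaterThan}
        import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}
    
        // Hypothetical schema and pushed-down filters for illustration only.
        val schema = StructType(Seq(
          StructField("id", IntegerType),
          StructField("name", StringType)))
    
        val filters: Seq[Filter] = Seq(GreaterThan("id", 10), EqualTo("name", "a"))
    
        // Some(sarg) pushing down `id > 10 AND name = 'a'`,
        // or None if no filter converts.
        val sarg = OrcFilters.createFilter(schema, filters)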
    
    @cloud-fan. The reason I kept this function in its original form, [OrcFilters.scala](https://github.com/apache/spark/blob/master/sql/hive/src/main/scala/org/apache/spark/sql/hive/orc/OrcFilters.scala#L82-L89), was to make comparison with the old one easy. I'm also aiming to add `DateType` later, with more test cases.
    > It looks like you randomly group the data types into 3 cases.
    
    I'll try to update it; a possible regrouping is sketched below.
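    
    One possible regrouping, as a sketch only (not a committed change, and `DateType` support would still need the test cases mentioned above), groups the cases by type family:
    
        private def isSearchableType(dataType: DataType) = dataType match {
          // Integral and floating-point types
          case ByteType | ShortType | IntegerType | LongType => true
          case FloatType | DoubleType => true
          // Non-numeric atomic types
          case StringType | BooleanType => true
          // Date/time and decimal types
          case DateType | TimestampType | _: DecimalType => true
          case _ => false
        }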


---
