Github user kiszk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13909#discussion_r88621571
  
    --- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/ExpressionEvalHelper.scala ---
    @@ -42,15 +42,55 @@ trait ExpressionEvalHelper extends GeneratorDrivenPropertyChecks {
            InternalRow.fromSeq(values.map(CatalystTypeConverters.convertToCatalyst))
       }
     
    +  protected def convertToCatalystUnsafe(a: Any): Any = a match {
    +    case arr: Array[Boolean] => UnsafeArrayData.fromPrimitiveArray(arr)
    +    case arr: Array[Byte] => UnsafeArrayData.fromPrimitiveArray(arr)
    +    case arr: Array[Short] => UnsafeArrayData.fromPrimitiveArray(arr)
    +    case arr: Array[Int] => UnsafeArrayData.fromPrimitiveArray(arr)
    +    case arr: Array[Long] => UnsafeArrayData.fromPrimitiveArray(arr)
    +    case arr: Array[Float] => UnsafeArrayData.fromPrimitiveArray(arr)
    +    case arr: Array[Double] => UnsafeArrayData.fromPrimitiveArray(arr)
    +    case other => CatalystTypeConverters.convertToCatalyst(other)
    +  }
    +
       protected def checkEvaluation(
          expression: => Expression, expected: Any, inputRow: InternalRow = EmptyRow): Unit = {
         val serializer = new JavaSerializer(new SparkConf()).newInstance
         val expr: Expression = serializer.deserialize(serializer.serialize(expression))
    -    val catalystValue = CatalystTypeConverters.convertToCatalyst(expected)
    +    // No codegen version expects GenericArrayData
    +    val catalystValue = expected match {
    +      case arr: Array[Byte] if expression.dataType == BinaryType => arr
    +      case arr: Array[_] => new GenericArrayData(arr.map(CatalystTypeConverters.convertToCatalyst))
    --- End diff ---
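    
    As an aside, a tiny sketch of what the new `convertToCatalystUnsafe` path produces for a primitive `Array[Int]`; the round-trip assertions below are my illustration, not part of the diff:
    
    ```scala
    import org.apache.spark.sql.catalyst.expressions.UnsafeArrayData
    
    // The unsafe path converts a primitive array directly, without boxing each element.
    val unsafe = UnsafeArrayData.fromPrimitiveArray(Array(1, 2, 3))
    assert(unsafe.numElements() == 3)
    assert(unsafe.toIntArray().sameElements(Array(1, 2, 3)))
    ```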
    
    I did the following. However, `LiteralExpressionSuite."binary literals"` still produces the same exception.
    
    - changed `Array[Any]` to `Array[_]` in `CatalystTypeConverters.convertToCatalyst`
    - removed line 63 of `ExpressionEvalHelper.scala`
    - changed line 62 of `ExpressionEvalHelper.scala` to `case arr: Array[Byte] => arr` (see the sketch after this list)
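    
    With those edits, the relevant part of `checkEvaluation` would look roughly like the sketch below; the wrapper name and the trailing `case other` are my illustration, only the two changed lines come from the list above:
    
    ```scala
    import org.apache.spark.sql.catalyst.CatalystTypeConverters
    
    // Hypothetical shape of the `expected match` block after the edits above.
    def toCatalystForInterpreted(expected: Any): Any = expected match {
      case arr: Array[Byte] => arr  // line 62 after the change (BinaryType guard dropped)
      // line 63 (`case arr: Array[_] => new GenericArrayData(...)`) removed
      case other => CatalystTypeConverters.convertToCatalyst(other)
    }
    ```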
    
    This set of changes also triggers another exception, in `ExpressionToSQLSuite."string function"`:
    ```java
    java.lang.ClassCastException: org.apache.spark.sql.catalyst.util.GenericArrayData cannot be cast to [B
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(generated.java:31)
        at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
        at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:231)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:225)
    ...
    ```
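    
    For what it's worth, a minimal sketch of the mismatch this stack trace points at, assuming (not verified here) that the binary value gets wrapped into a `GenericArrayData` while the generated code reads the column back as `byte[]` (`[B`):
    
    ```scala
    import scala.util.Try
    import org.apache.spark.sql.catalyst.util.GenericArrayData
    
    // A byte array that should stay a plain byte[] for BinaryType...
    val expected: Array[Byte] = Array[Byte](1, 2, 3)
    
    // ...but once it goes through the generic Array[_] path it becomes a GenericArrayData,
    val wrapped: Any = new GenericArrayData(expected.map(b => b: Any))
    
    // and reading a BinaryType column in generated code amounts to a cast to byte[],
    // which fails just like the stack trace above.
    println(Try(wrapped.asInstanceOf[Array[Byte]]))
    // Failure(java.lang.ClassCastException: ... GenericArrayData cannot be cast to [B)
    ```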
    