Eyal Farago created SPARK-16791:
-----------------------------------

             Summary: casting structs fails on Timestamp fields (interpreted mode only)
                 Key: SPARK-16791
                 URL: https://issues.apache.org/jira/browse/SPARK-16791
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.0.0, 1.6.1
            Reporter: Eyal Farago
            Priority: Minor


When casting a struct with a Timestamp field, a MatchError is thrown (stack trace pasted below).
The root cause is in org.apache.spark.sql.catalyst.expressions.Cast#cast
(https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala#L419).

case dt if dt == child.dataType => identity[Any]

should be modified to:

case dt if dt == from => identity[Any]

It seems to explode for Timestamp because
org.apache.spark.sql.catalyst.expressions.Cast#castToTimestamp doesn't have an
identity check or fallback case in its pattern matching.
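
For contrast, a top-level no-op cast to TimestampType does work in interpreted mode, because there `to` equals child.dataType and the identity shortcut fires before castToTimestamp is ever consulted. A minimal sketch (using the internal Catalyst expression API directly; the value is arbitrary):

import java.sql.Timestamp
import org.apache.spark.sql.catalyst.expressions.{Cast, Literal}
import org.apache.spark.sql.types.TimestampType

// No-op cast of a bare Timestamp: `to` == child.dataType == TimestampType,
// so Cast#cast returns identity[Any] and never reaches castToTimestamp.
val ts = Literal.create(Timestamp.valueOf("2016-07-29 10:00:00"), TimestampType)
Cast(ts, TimestampType).eval()   // returns the internal Long (microseconds) unchanged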

I'll shortly open a pull request with a failing test case and a fix.
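
For reference, a minimal repro sketch along these lines (arbitrary field name and value; the target struct type only differs by field name, so the cast is still allowed) produces the trace below when evaluated in interpreted mode:

import java.sql.Timestamp
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.expressions.{Cast, Literal}
import org.apache.spark.sql.types._

val fromType = StructType(Seq(StructField("ts", TimestampType)))
val toType   = StructType(Seq(StructField("renamed", TimestampType)))

// toType != child.dataType (different field name), so the top-level identity
// shortcut is skipped; castStruct then recurses with the field types and
// cast(TimestampType, TimestampType) lands in castToTimestamp.
val struct = Literal.create(Row(Timestamp.valueOf("2016-07-29 10:00:00")), fromType)
Cast(struct, toType).eval()   // scala.MatchError: TimestampType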

Caused by: scala.MatchError: TimestampType (of class org.apache.spark.sql.types.TimestampType$)
        at org.apache.spark.sql.catalyst.expressions.Cast.castToTimestamp(Cast.scala:185)
        at org.apache.spark.sql.catalyst.expressions.Cast.org$apache$spark$sql$catalyst$expressions$Cast$$cast(Cast.scala:424)
        at org.apache.spark.sql.catalyst.expressions.Cast$$anonfun$1.apply(Cast.scala:403)
        at org.apache.spark.sql.catalyst.expressions.Cast$$anonfun$1.apply(Cast.scala:402)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
        at org.apache.spark.sql.catalyst.expressions.Cast.castStruct(Cast.scala:402)
        at org.apache.spark.sql.catalyst.expressions.Cast.org$apache$spark$sql$catalyst$expressions$Cast$$cast(Cast.scala:435)
        at org.apache.spark.sql.catalyst.expressions.Cast.cast$lzycompute(Cast.scala:443)
        at org.apache.spark.sql.catalyst.expressions.Cast.cast(Cast.scala:443)
        at org.apache.spark.sql.catalyst.expressions.Cast.nullSafeEval(Cast.scala:445)
        at org.apache.spark.sql.catalyst.expressions.UnaryExpression.eval(Expression.scala:324)
        at org.apache.spark.sql.catalyst.expressions.ExpressionEvalHelper$class.evaluate(ExpressionEvalHelper.scala:80)
        at org.apache.spark.sql.catalyst.expressions.CastSuite.evaluate(CastSuite.scala:33)
        ... 58 more


