[ https://issues.apache.org/jira/browse/SPARK-10352?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Feynman Liang updated SPARK-10352:
----------------------------------
    Description: 
Running the code:
{code}
val inputString = "abc"
val row = InternalRow.apply(inputString)
val unsafeRow = UnsafeProjection.create(Array[DataType](StringType)).apply(row)
{code}
generates the error:
{code}
[info]   java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
[info]   at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.getUTF8String(rows.scala:46)
***snip***
{code}

  was:
Running the code:
{code:scala}
val inputString = "abc"
val row = InternalRow.apply(inputString)
val unsafeRow = UnsafeProjection.create(Array[DataType](StringType)).apply(row)
{code}
generates the error:
{code}
[info]   java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
[info]   at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.getUTF8String(rows.scala:46)
***snip***
{code}


> BaseGenericInternalRow.getUTF8String should support java.lang.String
> ---------------------------------------------------------------------
>
>                 Key: SPARK-10352
>                 URL: https://issues.apache.org/jira/browse/SPARK-10352
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Feynman Liang
>
> Running the code:
> {code}
> val inputString = "abc"
> val row = InternalRow.apply(inputString)
> val unsafeRow = UnsafeProjection.create(Array[DataType](StringType)).apply(row)
> {code}
> generates the error:
> {code}
> [info]   java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
> [info]   at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.getUTF8String(rows.scala:46)
> ***snip***
> {code}
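
For context, the ClassCastException is raised because Catalyst internal rows are expected to carry StringType columns as org.apache.spark.unsafe.types.UTF8String rather than java.lang.String, so getUTF8String casts the stored value and fails. A minimal caller-side sketch that avoids the exception, assuming the same Spark build (imports spelled out for completeness), is to convert the value with UTF8String.fromString before constructing the row:

{code}
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.catalyst.expressions.UnsafeProjection
import org.apache.spark.sql.types.{DataType, StringType}
import org.apache.spark.unsafe.types.UTF8String

// Wrap the Java String in Catalyst's internal UTF8String representation
// so that getUTF8String sees the type it expects.
val inputString = UTF8String.fromString("abc")
val row = InternalRow.apply(inputString)
val unsafeRow = UnsafeProjection.create(Array[DataType](StringType)).apply(row)
{code}

Whether getUTF8String itself should accept java.lang.String (as the title suggests) or the conversion should remain the caller's responsibility is the question this ticket raises.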