[ 
https://issues.apache.org/jira/browse/SPARK-48675?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Wenchen Fan reassigned SPARK-48675:
-----------------------------------

    Assignee: Nikola Mandic

> Cache table doesn't work with collated column
> ---------------------------------------------
>
>                 Key: SPARK-48675
>                 URL: https://issues.apache.org/jira/browse/SPARK-48675
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 4.0.0
>            Reporter: Nikola Mandic
>            Assignee: Nikola Mandic
>            Priority: Major
>
> The following sequence of queries produces the error:
> {code:java}
> > cache lazy table t as select col from values ('a' collate utf8_lcase) as (col);
> > select col from t;
> org.apache.spark.SparkException: not support type: org.apache.spark.sql.types.StringType@1.
>         at org.apache.spark.sql.errors.QueryExecutionErrors$.notSupportTypeError(QueryExecutionErrors.scala:1069)
>         at org.apache.spark.sql.execution.columnar.ColumnBuilder$.apply(ColumnBuilder.scala:200)
>         at org.apache.spark.sql.execution.columnar.DefaultCachedBatchSerializer$$anon$1.$anonfun$next$1(InMemoryRelation.scala:85)
>         at scala.collection.immutable.List.map(List.scala:247)
>         at scala.collection.immutable.List.map(List.scala:79)
>         at org.apache.spark.sql.execution.columnar.DefaultCachedBatchSerializer$$anon$1.next(InMemoryRelation.scala:84)
>         at org.apache.spark.sql.execution.columnar.DefaultCachedBatchSerializer$$anon$1.next(InMemoryRelation.scala:82)
>         at org.apache.spark.sql.execution.columnar.CachedRDDBuilder$$anon$2.next(InMemoryRelation.scala:296)
>         at org.apache.spark.sql.execution.columnar.CachedRDDBuilder$$anon$2.next(InMemoryRelation.scala:293)
> ... {code}
> The same problem also occurs with non-lazy cached tables.
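> The stack trace suggests the cache serializer's ColumnBuilder selects a builder by matching on the exact data type, so a collated string (a distinct StringType instance, printed above as {{StringType@1}}) falls through to the "not support type" branch. A minimal Scala sketch of that failure mode, using toy types rather than Spark's actual classes (the names below are hypothetical, for illustration only):
> {code:java}
> // Toy model of the dispatch in ColumnBuilder.apply; NOT Spark's real code.
> // A string type carrying a non-default collation id is a different value
> // from the default StringType, so an equality-based match misses it.
> case class StringType(collationId: Int = 0)
>
> val UTF8_BINARY = StringType(0) // default collation
> val UTF8_LCASE  = StringType(1) // e.g. 'a' collate utf8_lcase
>
> def columnBuilderFor(dt: Any): String = dt match {
>   case UTF8_BINARY => "StringColumnBuilder"     // only the default matches
>   case other       => s"not support type: $other" // collated strings land here
> }
> {code}
> Under this sketch, {{columnBuilderFor(StringType(1))}} takes the error branch even though the value is still a string type, mirroring the exception above; a fix would match on the string type's shape (any collation) rather than on the default instance.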



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
