[ https://issues.apache.org/jira/browse/SPARK-36854?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-36854:
------------------------------------

    Assignee: Apache Spark  (was: Max Gekk)

> Parquet reader fails on load of ANSI interval when off-heap is enabled
> -----------------------------------------------------------------------
>
>                 Key: SPARK-36854
>                 URL: https://issues.apache.org/jira/browse/SPARK-36854
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Max Gekk
>            Assignee: Apache Spark
>            Priority: Major
>
> When the off-heap column vector is enabled, the Parquet reader fails. The example below demonstrates the issue:
> {code:scala}
> scala> val df = Seq(java.time.Period.ofMonths(1)).toDF
> df: org.apache.spark.sql.DataFrame = [value: interval year to month]
>
> scala> df.write.parquet("/Users/maximgekk/tmp/parquet_offheap")
>
> scala> spark.conf.set("spark.sql.columnVector.offheap.enabled", true)
>
> scala> spark.read.parquet("/Users/maximgekk/tmp/parquet_offheap")
> res2: org.apache.spark.sql.DataFrame = [value: interval year to month]
>
> scala> spark.read.parquet("/Users/maximgekk/tmp/parquet_offheap").show()
> 21/09/25 22:09:03 ERROR Executor: Exception in task 0.0 in stage 3.0 (TID 3)
> java.lang.RuntimeException: Unhandled YearMonthIntervalType(0,1)
>     at org.apache.spark.sql.execution.vectorized.OffHeapColumnVector.reserveInternal(OffHeapColumnVector.java:562)
>     at org.apache.spark.sql.execution.vectorized.OffHeapColumnVector.<init>(OffHeapColumnVector.java:75)
>     at org.apache.spark.sql.execution.vectorized.OffHeapColumnVector.allocateColumns(OffHeapColumnVector.java:53)
>     at org.apache.spark.sql.execution.vectorized.OffHeapColumnVector.allocateColumns(OffHeapColumnVector.java:42)
> {code}
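For context, ANSI year-month intervals are stored physically as a number of months in a 4-byte int, and day-time intervals as microseconds in an 8-byte long, so one plausible direction for a fix is to have the vectorized reader reserve the same buffers it already allocates for int/long columns. The sketch below is only an illustration of that mapping, not the actual Spark patch; physicalTypeFor is a hypothetical helper and is not part of Spark's API.

{code:scala}
// Hypothetical sketch, not the actual fix: map ANSI interval types to the
// integral types that back them, so buffer reservation can reuse the
// existing int/long paths instead of throwing "Unhandled YearMonthIntervalType".
import org.apache.spark.sql.types._

def physicalTypeFor(dt: DataType): DataType = dt match {
  case _: YearMonthIntervalType => IntegerType // months stored as a 4-byte int
  case _: DayTimeIntervalType   => LongType    // microseconds stored as an 8-byte long
  case other                    => other
}
{code}

A workaround until the fix lands is to read such files with spark.sql.columnVector.offheap.enabled left at its default (false), as the on-heap path in the reproduction above reads the data back correctly.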