Ramón García Fernández created SPARK-44165:
----------------------------------------------
             Summary: Exception when reading parquet file with TIME fields
                 Key: SPARK-44165
                 URL: https://issues.apache.org/jira/browse/SPARK-44165
             Project: Spark
          Issue Type: New Feature
          Components: SQL
    Affects Versions: 3.4.1, 3.4.0
         Environment: Spark 3.4.0 downloaded from apache.spark.org. Also reproduced with the latest build.
            Reporter: Ramón García Fernández
         Attachments: timeonly.parquet

When one reads a Parquet file containing TIME fields (with either INT32 or INT64 storage), an exception is thrown. From the Spark shell:

{{> val df = spark.read.parquet("timeonly.parquet")}}

23/06/24 13:24:54 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
org.apache.spark.sql.AnalysisException: Illegal Parquet type: INT32 (TIME(MILLIS,true)).
        at org.apache.spark.sql.errors.QueryCompilationErrors$.illegalParquetTypeError(QueryCompilationErrors.scala:1762)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetToSparkSchemaConverter.illegalType$1(ParquetSchemaConverter.scala:206)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetToSparkSchemaConverter.$anonfun$convertPrimitiveField$2(ParquetSchemaConverter.scala:252)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetToSparkSchemaConverter.convertPrimitiveField(ParquetSchemaConverter.scala:224)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetToSparkSchemaConverter.convertField(ParquetSchemaConverter.scala:187)
        at org.apache.spark.sql.execution.datasources.parquet.ParquetToSparkSchemaConverter.$anonfun$convertInternal$3(ParquetSchemaConverter.scala:147)

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org