dongjoon-hyun commented on code in PR #45703:
URL: https://github.com/apache/spark/pull/45703#discussion_r1539476481


##########
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetSchemaConverter.scala:
##########

```diff
@@ -404,6 +401,35 @@ class ParquetToSparkSchemaConverter(
     }
   }

+  private def convertVariantField(groupColumn: GroupColumnIO): ParquetColumn = {
+    if (groupColumn.getChildrenCount != 2) {
+      // We may allow more than two children in the future, so consider this unsupported.
+      throw QueryCompilationErrors.
+        parquetTypeUnsupportedYetError("variant with more than two fields")
+    }
+    // Find the binary columns, and validate that they have the correct type.
+    val valueAndMetadata = Seq("value", "metadata").map { colName =>
+      val idx = (0 until groupColumn.getChildrenCount)
+        .find(groupColumn.getChild(_).getName == colName)
```

Review Comment:
   Just a question. Although this is supposed to be a lightweight procedure, do we need to use iteration loops like the one above (line 411 ~ 413)? I'm just wondering if we can do this more simply.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
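The question above is whether the per-field index scan over `(0 until groupColumn.getChildrenCount)` could be written more simply. One possible alternative is to build a name-to-index map once and then look each field up directly. The following is a minimal, self-contained Scala sketch of that idea; `Column` and `GroupColumn` are hypothetical stand-ins for illustration, not the real parquet-mr `GroupColumnIO` API:

```scala
// Hypothetical stand-ins for the parquet-mr column classes, just to
// illustrate the lookup pattern in isolation.
final case class Column(name: String)

final class GroupColumn(children: Seq[Column]) {
  def getChildrenCount: Int = children.size
  def getChild(i: Int): Column = children(i)

  // Build the name -> index map once; each subsequent lookup is O(1)
  // instead of a linear scan over the children.
  private val indexByName: Map[String, Int] =
    children.zipWithIndex.map { case (c, i) => c.name -> i }.toMap

  def indexOf(name: String): Option[Int] = indexByName.get(name)
}

object Demo {
  def main(args: Array[String]): Unit = {
    val group = new GroupColumn(Seq(Column("value"), Column("metadata")))
    // Look up both required variant fields by name.
    val idxs = Seq("value", "metadata").map(group.indexOf)
    println(idxs) // List(Some(0), Some(1))
  }
}
```

With only two children the linear `find` and the map lookup are equivalent in practice; the map version mainly reads as a single lookup rather than a loop, which may be what the reviewer has in mind.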
########## sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetSchemaConverter.scala: ########## @@ -404,6 +401,35 @@ class ParquetToSparkSchemaConverter( } } + private def convertVariantField(groupColumn: GroupColumnIO): ParquetColumn = { + if (groupColumn.getChildrenCount != 2 ) { + // We may allow more than two children in the future, so consider this unsupported. + throw QueryCompilationErrors. + parquetTypeUnsupportedYetError("variant with more than two fields") + } + // Find the binary columns, and validate that they have the correct type. + val valueAndMetadata = Seq("value", "metadata").map { colName => + val idx = (0 until groupColumn.getChildrenCount) + .find(groupColumn.getChild(_).getName == colName) Review Comment: Just a question. Although this is supposed to be light procedure, do we need to use iteration loops like the above (line 411 ~ 413)? I'm just wondering if we can do more easier than this. -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org --------------------------------------------------------------------- To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org