cloud-fan commented on a change in pull request #31284: URL: https://github.com/apache/spark/pull/31284#discussion_r568854479
##########
File path: sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetIOSuite.scala
##########

```diff
@@ -1205,6 +1205,32 @@ class ParquetIOSuite extends QueryTest with ParquetTest with SharedSparkSession
       }
     }
   }
+
+  test("SPARK-34167: read LongDecimals with precision < 10, VectorizedReader off") {
+    // decimal32-written-as-64-bit.snappy.parquet was generated using a 3rd-party library. It has
+    // 10 rows of Decimal(9, 1) written as LongDecimal instead of an IntDecimal
+    readParquetFile(testFile("test-data/decimal32-written-as-64-bit.snappy.parquet"), false) {
+      df => assert(10 == df.collect().length)
+    }
+    // decimal32-written-as-64-bit-dict.snappy.parquet was generated using a 3rd-party library. It
+    // has 2048 rows of Decimal(3, 1) written as LongDecimal instead of an IntDecimal
+    readParquetFile(testFile("test-data/decimal32-written-as-64-bit-dict.snappy.parquet"), false) {
+      df => assert(2048 == df.collect().length)
+    }
+  }
+
+  test("SPARK-34167: read LongDecimals with precision < 10, VectorizedReader on") {
+    // decimal32-written-as-64-bit.snappy.parquet was generated using a 3rd-party library. It has
+    // 10 rows of Decimal(9, 1) written as LongDecimal instead of an IntDecimal
+    readParquetFile(testFile("test-data/decimal32-written-as-64-bit.snappy.parquet")) { df =>
```

Review comment:

Why does the generated parquet file have the special Spark metadata key? Someone else also mentioned it before: https://github.com/rapidsai/cudf/pull/3555#discussion_r355241948
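For context on why such files are legal: the Parquet format stores a `DECIMAL(precision, scale)` value as its unscaled integer, and the spec allows that integer to be stored as INT32 for precision ≤ 9 or as INT64 for precision ≤ 18. A writer may therefore store a `Decimal(9, 1)` column in an INT64 column, which is exactly what these third-party files do and what the fixed reader must accept. A minimal Python sketch of the unscaled-value arithmetic (the `unscaled` helper is hypothetical, for illustration only — not part of Spark or Parquet):

```python
from decimal import Decimal

def unscaled(value: Decimal, scale: int) -> int:
    # Parquet stores DECIMAL(p, s) as the unscaled integer: value * 10**s.
    return int(value.scaleb(scale))

# A Decimal(9, 1) value from the test file's shape, e.g. 12345678.9:
u = unscaled(Decimal("12345678.9"), scale=1)
print(u)                      # 123456789

# Precision 9 guarantees the unscaled value fits in INT32...
print(-(2**31) <= u < 2**31)  # True
# ...but INT64 is also a legal physical type for it under the spec,
# so a reader cannot assume INT32 just because precision < 10.
print(-(2**63) <= u < 2**63)  # True
```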