Robert Joseph Evans created SPARK-41218:
-------------------------------------------

             Summary: ParquetTable reports it supports negative scale decimal values
                 Key: SPARK-41218
                 URL: https://issues.apache.org/jira/browse/SPARK-41218
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 3.0.0, 3.4.0
            Reporter: Robert Joseph Evans


This is likely very minor, but {{ParquetTable}} says it supports all {{AtomicTypes}}:

https://github.com/apache/spark/blob/07427b854be58810bd485c00c5e5c576d5aa404e/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/parquet/ParquetTable.scala#L52

But the Parquet spec and the Parquet code say that negative scale decimal values are not supported, and the writer will throw an exception if you try to store one:

{code}
java.lang.IllegalArgumentException: Invalid DECIMAL scale: -2
        at org.apache.parquet.Preconditions.checkArgument(Preconditions.java:57)
        at org.apache.parquet.schema.Types$BasePrimitiveBuilder.decimalMetadata(Types.java:616)
        at org.apache.parquet.schema.Types$BasePrimitiveBuilder.build(Types.java:443)
        at org.apache.parquet.schema.Types$BasePrimitiveBuilder.build(Types.java:338)
{code}
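
For reference, a minimal reproduction sketch from spark-shell. It assumes the {{spark.sql.legacy.allowNegativeScaleOfDecimal}} flag (needed on 3.x just to construct a negative-scale {{DecimalType}} at all), and the output path is only illustrative:

{code}
import org.apache.spark.sql.Row
import org.apache.spark.sql.types._

// Needed on 3.x just to be able to construct DecimalType(5, -2).
spark.conf.set("spark.sql.legacy.allowNegativeScaleOfDecimal", "true")

val schema = StructType(Seq(StructField("d", DecimalType(5, -2))))
val df = spark.createDataFrame(
  spark.sparkContext.parallelize(Seq(Row(new java.math.BigDecimal(12300)))),
  schema)

// Fails while the Spark schema is converted to a Parquet schema:
// java.lang.IllegalArgumentException: Invalid DECIMAL scale: -2
df.write.parquet("/tmp/negative-scale-decimal")  // path is illustrative
{code}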

We should update the {{ParquetTable}} code so that it accurately reports this limitation.
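
One possible shape of the change (a hedged sketch only, assuming the {{supportsDataType}} override at the linked line follows the usual file-source pattern; the actual patch may look different):

{code}
// Sketch: reject negative-scale decimals up front instead of failing later in parquet-mr.
override def supportsDataType(dataType: DataType): Boolean = dataType match {
  case d: DecimalType if d.scale < 0 => false  // Parquet does not allow negative scales
  case _: AtomicType => true
  case st: StructType => st.forall(f => supportsDataType(f.dataType))
  case ArrayType(elementType, _) => supportsDataType(elementType)
  case MapType(keyType, valueType, _) =>
    supportsDataType(keyType) && supportsDataType(valueType)
  case udt: UserDefinedType[_] => supportsDataType(udt.sqlType)
  case _ => false
}
{code}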


