Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23132#discussion_r237058331
  
    --- Diff: docs/sql-migration-guide-upgrade.md ---
    @@ -9,6 +9,8 @@ displayTitle: Spark SQL Upgrading Guide
     
     ## Upgrading From Spark SQL 2.4 to 3.0
     
    +  - In Spark version 2.4 and earlier, the accepted format of decimals
parsed from JSON is an optional sign ('+' or '-'), followed by a sequence of
zero or more decimal digits, optionally followed by a fraction, optionally
followed by an exponent; any commas are removed from the input before parsing.
Since Spark 3.0, the accepted format depends on the locale, which can be set
via the JSON option `locale`. The default locale is `en-US`. To restore the
previous behavior, set `spark.sql.legacy.decimalParsing.enabled` to `true`.
    --- End diff ---
    
    I have the same question. Do we need the `DecimalFormat` when locale is 
`en-US`?
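
    For reference, a minimal sketch of how the locale changes what
`java.text.DecimalFormat` accepts (this is not Spark's actual parser; the
object name, pattern, and inputs are illustrative assumptions):

    ```scala
    import java.text.{DecimalFormat, DecimalFormatSymbols}
    import java.util.Locale

    object DecimalLocaleSketch {
      // Build a locale-aware parser that returns BigDecimal rather than Double.
      def parse(s: String, locale: Locale): java.math.BigDecimal = {
        val fmt = new DecimalFormat("#,##0.#", new DecimalFormatSymbols(locale))
        fmt.setParseBigDecimal(true)
        fmt.parse(s).asInstanceOf[java.math.BigDecimal]
      }

      def main(args: Array[String]): Unit = {
        // en-US: ',' is the grouping separator, '.' the decimal separator.
        println(parse("1,000.01", Locale.US))      // 1000.01
        // de-DE swaps them: '.' groups digits, ',' marks the fraction.
        println(parse("1.000,01", Locale.GERMANY)) // 1000.01
        // Plain new java.math.BigDecimal("1,000.01") would throw a
        // NumberFormatException, since it accepts no grouping separators.
      }
    }
    ```

    Under these assumptions, the only `en-US`-specific effect is
grouping-separator handling, which 2.4 already emulated by stripping commas
before parsing.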

