GitHub user cloud-fan opened a pull request: https://github.com/apache/spark/pull/14599
[SPARK-17013][SQL] Handle corner case for negative integral literals

## What changes were proposed in this pull request?

Spark 2.0 parses a negative numeric literal as the unary minus of a positive literal. This causes problems for edge cases such as -9223372036854775809 being parsed as decimal instead of bigint. This PR fixes it by making a negative integral value a direct literal in the parser.

## How was this patch tested?

number-format.sql

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/cloud-fan/spark bug

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/14599.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #14599

----

commit 65f6d6ca5c6efad5eb5b80881b1e650fca7d2bec
Author: Wenchen Fan <wenc...@databricks.com>
Date: 2016-08-11T08:36:05Z

    corner case for negative literal
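As a hedged illustration of why "unary minus of a positive literal" breaks near the edge of the bigint range (this is not Spark's parser code, just a minimal Scala sketch): the magnitude of Long.MinValue, 9223372036854775808, exceeds Long.MaxValue, 9223372036854775807, so parsing the positive part on its own overflows a Long even though the signed value fits.

```scala
import scala.util.Try

// Sketch of the asymmetry at the edge of the 64-bit signed range.
// Not Spark's actual parser: it only demonstrates why negating a
// separately parsed positive literal cannot represent Long.MinValue.
object NegativeLiteralDemo {
  def main(args: Array[String]): Unit = {
    // Parsing the magnitude alone fails: 9223372036854775808 > Long.MaxValue.
    val asPositive = Try("9223372036854775808".toLong)
    println(asPositive.isFailure) // NumberFormatException was thrown

    // Parsing the signed literal as one token succeeds.
    val direct = "-9223372036854775808".toLong
    println(direct == Long.MinValue)
  }
}
```

This asymmetry is why, before the fix, a parser that sees `-` and the digits separately must fall back to a wider type (decimal) for such values, whereas treating the sign and digits as a single literal lets them stay bigint.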