Github user marmbrus commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3208#discussion_r20688295
  
    --- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/SqlParser.scala ---
    @@ -339,18 +339,15 @@ class SqlParser extends AbstractSparkSQLParser {
         | floatLit ^^ { f => Literal(f.toDouble) }
         )
     
    -  private val longMax = BigDecimal(s"${Long.MaxValue}")
    -  private val longMin = BigDecimal(s"${Long.MinValue}")
    -  private val intMax = BigDecimal(s"${Int.MaxValue}")
    -  private val intMin = BigDecimal(s"${Int.MinValue}")
    -
       private def toNarrowestIntegerType(value: String) = {
         val bigIntValue = BigDecimal(value)
     
         bigIntValue match {
    -      case v if v < longMin || v > longMax => v
    -      case v if v < intMin || v > intMax => v.toLong
    -      case v => v.toInt
    +      case v if bigIntValue.isValidByte => v.toByteExact
    +      case v if bigIntValue.isValidShort => v.toShortExact
    +      case v if bigIntValue.isValidInt => v.toIntExact
    +      case v if bigIntValue.isValidLong => v.toLongExact
    +      case v => v
    --- End diff --
    
    Okay, sorry, I realize I initially said this was a good idea. Thinking about it further, though, I'm not sure this is actually something we want to do. The memory benefits of picking the smallest possible number representation don't really seem to outweigh the added complexity of suddenly having to deal with bytes everywhere.
    
    Are there any other SQL systems that do this?
    
    To be clear, I am in favor of using BigDecimal's `isValidX` checks instead of our hand-coded bounds checking for Int/Long/BigDecimal.
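    
    For reference, a minimal standalone sketch of that narrower change (illustrative only, not the actual patch): keep the existing Int/Long/BigDecimal fallback, but use BigDecimal's built-in `isValidInt`/`isValidLong` in place of the hand-coded min/max comparisons. The object name and `main` method here are just scaffolding for the example.
    
```scala
// Illustrative sketch only (not the SqlParser patch): narrow a numeric string
// to Int, then Long, and otherwise keep it as a BigDecimal, using the built-in
// isValidInt / isValidLong checks instead of hand-coded min/max bounds.
object NarrowestIntegerSketch {
  def toNarrowestIntegerType(value: String): Any = {
    val bigIntValue = BigDecimal(value)
    bigIntValue match {
      case v if v.isValidInt  => v.toIntExact   // fits in a 32-bit Int
      case v if v.isValidLong => v.toLongExact  // fits in a 64-bit Long
      case v                  => v              // too big: keep as BigDecimal
    }
  }

  def main(args: Array[String]): Unit = {
    println(toNarrowestIntegerType("42"))                      // Int
    println(toNarrowestIntegerType("9999999999"))              // Long
    println(toNarrowestIntegerType("123456789012345678901"))   // BigDecimal
  }
}
```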

