Github user rxin commented on the pull request:

    https://github.com/apache/spark/pull/8780#issuecomment-141609708
  
    That should work.
    
    On Sep 18, 2015, at 7:15 AM, Travis Hegner <notificati...@github.com> wrote:
    
    That is exactly what I was afraid of. Would the patch make more sense if
    it checked precision *only* for a zero value? Does it ever make sense to
    have a precision of zero (or less than zero, for that matter)? Could we
    safely enforce defaults whenever precision is zero (or less), regardless
    of scale? That would still solve my problem, hopefully without
    compromising functionality for everyone else.
    
    —
    Reply to this email directly or view it on GitHub
    <https://github.com/apache/spark/pull/8780#issuecomment-141462484>.
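
The fallback proposed above can be sketched as follows. This is a hypothetical illustration, not the actual patch: the `normalize` helper is invented for this sketch, and the default values (38, 18) are an assumption matching Spark's `DecimalType.SYSTEM_DEFAULT`. The idea is to substitute defaults only when the reported precision is zero or negative (as some JDBC drivers report), leaving any legitimate precision/scale pair untouched.

```java
import java.util.Arrays;

public class DecimalDefaults {
    // Assumed defaults for the sketch; Spark's DecimalType.SYSTEM_DEFAULT
    // is DecimalType(38, 18).
    static final int DEFAULT_PRECISION = 38;
    static final int DEFAULT_SCALE = 18;

    // Returns {precision, scale}. Falls back to the defaults only when
    // precision is zero or negative, regardless of the reported scale;
    // any positive precision is passed through unchanged.
    static int[] normalize(int precision, int scale) {
        if (precision <= 0) {
            return new int[] {DEFAULT_PRECISION, DEFAULT_SCALE};
        }
        return new int[] {precision, scale};
    }

    public static void main(String[] args) {
        // Driver reported a bogus zero precision: fall back to defaults.
        System.out.println(Arrays.toString(normalize(0, 5)));
        // A legitimate (10, 2) decimal is kept as-is.
        System.out.println(Arrays.toString(normalize(10, 2)));
    }
}
```

Under this scheme a driver-reported zero (or negative) precision is the only trigger for the defaults, so well-formed metadata from other sources is never altered.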


