[jira] [Commented] (SPARK-9069) Remove unlimited DecimalType

2015-07-22 Apache Spark (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-9069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14638165#comment-14638165 ]

Apache Spark commented on SPARK-9069:
-------------------------------------

User 'davies' has created a pull request for this issue:
https://github.com/apache/spark/pull/7605

> Remove unlimited DecimalType
> ----------------------------
>
> Key: SPARK-9069
> URL: https://issues.apache.org/jira/browse/SPARK-9069
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>  Reporter: Davies Liu
>  Assignee: Davies Liu
>  Priority: Blocker
>
> We should remove DecimalType.Unlimited, because BigDecimal does not truly 
> support unlimited precision, especially for division (a quotient such as 1/3 
> has no finite decimal representation, so a bounded result precision must be chosen).
> We can cap the maximum precision at 38 (to match Hive 0.13+), with a default 
> precision and scale of (38, 18).
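
As a minimal sketch of what the bounded type would look like in a Spark SQL
schema (the object and field names below are purely illustrative, and this
assumes the Scala DecimalType(precision, scale) constructor with a maximum
precision of 38):

    import org.apache.spark.sql.types.{DecimalType, LongType, StructField, StructType}

    object BoundedDecimalExample {
      def main(args: Array[String]): Unit = {
        // At most 38 significant digits, 18 of them after the decimal point.
        val boundedDecimal = DecimalType(38, 18)

        // Hypothetical schema using the bounded type instead of DecimalType.Unlimited.
        val schema = StructType(Seq(
          StructField("id", LongType, nullable = false),
          StructField("amount", boundedDecimal, nullable = true)
        ))

        // Prints: struct<id:bigint,amount:decimal(38,18)>
        println(schema.simpleString)
      }
    }

Division is where an unbounded type breaks down in practice: java.math.BigDecimal.divide
throws ArithmeticException for a non-terminating quotient such as 1/3 unless a scale or
MathContext is supplied, so a fixed maximum precision has to be chosen somewhere.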





[jira] [Commented] (SPARK-9069) Remove unlimited DecimalType

2015-07-23 Apache Spark (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-9069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14639985#comment-14639985 ]

Apache Spark commented on SPARK-9069:
-------------------------------------

User 'davies' has created a pull request for this issue:
https://github.com/apache/spark/pull/7634



