[ https://issues.apache.org/jira/browse/SPARK-19754?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15890261#comment-15890261 ]

Takeshi Yamamuro commented on SPARK-19754:
------------------------------------------

Aha, I see. I also checked this in v2.0.2:
{code}
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.2
      /_/
         
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_31)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sql("""SELECT CAST(get_json_object('{"a": 1.6}', '$.a') AS INT)""").show
+---------------------------------------------+
|CAST(get_json_object({"a": 1.6}, $.a) AS INT)|
+---------------------------------------------+
|                                            2|
+---------------------------------------------+
{code}

It seems some patches have changed this behaviour.
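
FWIW, a minimal spark-shell sketch of a possible workaround, assuming the rounding comes from the string-to-INT cast path (get_json_object always returns a string) and that Spark's DOUBLE-to-INT cast truncates toward zero like Scala's toInt:
{code}
// Sketch: compare the direct cast with an explicit DOUBLE-to-INT cast.
// get_json_object returns a string, so the first query exercises the
// string-to-INT cast; the second forces a numeric cast first, which
// should truncate toward zero.
sql("""SELECT CAST(get_json_object('{"a": 1.6}', '$.a') AS INT) AS direct_cast""").show
sql("""SELECT CAST(CAST(get_json_object('{"a": 1.6}', '$.a') AS DOUBLE) AS INT) AS via_double""").show
{code}
If the DOUBLE-to-INT cast truncates as expected, the second query should give {{1}} for this input, matching Hive.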

> Casting to int from a JSON-parsed float rounds instead of truncating
> --------------------------------------------------------------------
>
>                 Key: SPARK-19754
>                 URL: https://issues.apache.org/jira/browse/SPARK-19754
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.3, 2.1.0
>            Reporter: Juan Pumarino
>            Priority: Minor
>
> When retrieving a float value from a JSON document and then casting it to an
> integer, Hive simply truncates it, while Spark rounds up when the fractional
> part is >= 0.5.
> In Hive, the following query returns {{1}}, whereas in a Spark shell the
> result is {{2}}.
> {code}
> SELECT CAST(get_json_object('{"a": 1.6}', '$.a') AS INT)
> {code}


