[ https://issues.apache.org/jira/browse/SPARK-36229?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17384610#comment-17384610 ]

dgd_contributor commented on SPARK-36229:
-----------------------------------------

Thanks, I will look into this.

 

> conv() inconsistently handles invalid strings with > 64 invalid characters
> --------------------------------------------------------------------------
>
>                 Key: SPARK-36229
>                 URL: https://issues.apache.org/jira/browse/SPARK-36229
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: Tim Armstrong
>            Priority: Major
>
> SPARK-33428 fixed an ArrayIndexOutOfBoundsException in conv(), but it introduced a new 
> inconsistency in behaviour: for strings of invalid digits, the returned value differs 
> once the input exceeds the 64-character threshold.
>  
> {noformat}
> scala> spark.sql("select conv(repeat('?', 64), 10, 16)").show
> +---------------------------+
> |conv(repeat(?, 64), 10, 16)|
> +---------------------------+
> |                          0|
> +---------------------------+
> scala> spark.sql("select conv(repeat('?', 65), 10, 16)").show
> +---------------------------+
> |conv(repeat(?, 65), 10, 16)|
> +---------------------------+
> |           FFFFFFFFFFFFFFFF|
> +---------------------------+
> scala> spark.sql("select conv(repeat('?', 65), 10, -16)").show
> +----------------------------+
> |conv(repeat(?, 65), 10, -16)|
> +----------------------------+
> |                          -1|
> +----------------------------+
> scala> spark.sql("select conv(repeat('?', 64), 10, -16)").show
> +----------------------------+
> |conv(repeat(?, 64), 10, -16)|
> +----------------------------+
> |                           0|
> +----------------------------+{noformat}
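> For comparison, a minimal sketch of the same repro through the DataFrame API, using conv from org.apache.spark.sql.functions and assuming a SparkSession named spark is in scope (column aliases below are illustrative). Which of 0, -1, or FFFFFFFFFFFFFFFF is the intended result for invalid input is exactly the open question:
> {noformat}
> import org.apache.spark.sql.functions._
>
> // Same invalid-digit input at and above the 64-character boundary.
> val df = spark.range(1).select(
>   conv(lit("?" * 64), 10, 16).as("len64_base16"),
>   conv(lit("?" * 65), 10, 16).as("len65_base16"),
>   conv(lit("?" * 64), 10, -16).as("len64_baseNeg16"),
>   conv(lit("?" * 65), 10, -16).as("len65_baseNeg16"))
> df.show()
> // Observed: 0, FFFFFFFFFFFFFFFF, 0, -1 -- the 65-character cases diverge.
> {noformat}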


