[ https://issues.apache.org/jira/browse/SPARK-29776?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17004447#comment-17004447 ]
Ankit Raj Boudh edited comment on SPARK-29776 at 12/28/19 10:34 AM:
--------------------------------------------------------------------

[~hyukjin.kwon], as per the discussion, I have updated the status of both SPARK-29776 and SPARK-29853 to "Resolved" with the resolution "Not a Problem". If that is correct, could you please assign both Jiras to me so that we can close them?

was (Author: ankitraj):
[~hyukjin.kwon], I have updated the status of both SPARK-29776 and SPARK-29853 to "Resolved" with the resolution "Not a Problem". If that is correct, could you please assign both Jiras to me so that we can close them?

> rpad and lpad should return NULL when padstring parameter is empty
> ------------------------------------------------------------------
>
>                 Key: SPARK-29776
>                 URL: https://issues.apache.org/jira/browse/SPARK-29776
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: ABHISHEK KUMAR GUPTA
>            Priority: Major
>
> As per the definition of rpad:
>
> rpad(str, len, pad) - Returns str, right-padded with pad to a length of len. If str is longer than len, the return value is shortened to len characters. *If the pad string is empty, the return value is NULL.*
>
> Example in Spark:
> {code}
> 0: jdbc:hive2://10.18.19.208:23040/default> SELECT rpad('hi', 5, '');
> +----------------+
> | rpad(hi, 5, )  |
> +----------------+
> | hi             |
> +----------------+
> {code}
> It should return NULL as per the definition.
>
> Hive's behavior matches the definition: it returns NULL when the pad string is empty.
> {code}
> INFO  : Concurrency mode is disabled, not creating a lock manager
> +-------+
> |  _c0  |
> +-------+
> | NULL  |
> +-------+
> {code}
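>
> Until Spark's semantics are aligned with the definition (or the issue is settled as "Not a Problem"), a minimal workaround sketch in Spark SQL is to map an empty pad string to NULL explicitly before calling rpad. This is an illustrative assumption, not an official fix, and the names str_col, pad_col, and t are hypothetical:
> {code}
> -- Hedged workaround sketch: emulate Hive's documented behavior by
> -- returning NULL whenever the pad string is empty, instead of relying
> -- on rpad's own handling of an empty pad.
> SELECT str_col,
>        pad_col,
>        CASE
>          WHEN pad_col = '' THEN CAST(NULL AS STRING)
>          ELSE rpad(str_col, 5, pad_col)
>        END AS padded
> FROM VALUES ('hi', ''), ('hi', 'x') AS t(str_col, pad_col);
> {code}
> The first row yields NULL, matching Hive's result for rpad('hi', 5, ''), while the second row pads normally to 'hixxx'.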