[ https://issues.apache.org/jira/browse/SPARK-16983?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-16983:
----------------------------------
    Description: 
Currently, two-word window functions like `row_number`, `dense_rank`, `percent_rank`, and `cume_dist` are expressed without `_` in error messages. We should show the correct names.

**Before**
{code}
scala> sql("select row_number()").show
java.lang.UnsupportedOperationException: Cannot evaluate expression: rownumber()
{code}

**After**
{code}
scala> sql("select row_number()").show
java.lang.UnsupportedOperationException: Cannot evaluate expression: row_number()
{code}

  was:
Currently, two-word window functions like `row_number`, `dense_rank`, `percent_rank`, and `cume_dist` are expressed without `_`. We should show the correct name.

**Before**
{code}
scala> sql("select row_number()").show
java.lang.UnsupportedOperationException: Cannot evaluate expression: rownumber()
{code}

**After**
{code}
scala> sql("select row_number()").show
java.lang.UnsupportedOperationException: Cannot evaluate expression: row_number()
{code}


> Add `prettyName` to row_number, dense_rank, percent_rank, cume_dist
> -------------------------------------------------------------------
>
>                 Key: SPARK-16983
>                 URL: https://issues.apache.org/jira/browse/SPARK-16983
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Dongjoon Hyun
>            Priority: Minor
>
> Currently, two-word window functions like `row_number`, `dense_rank`,
> `percent_rank`, and `cume_dist` are expressed without `_` in error messages.
> We should show the correct names.
> **Before**
> {code}
> scala> sql("select row_number()").show
> java.lang.UnsupportedOperationException: Cannot evaluate expression:
> rownumber()
> {code}
>
> **After**
> {code}
> scala> sql("select row_number()").show
> java.lang.UnsupportedOperationException: Cannot evaluate expression:
> row_number()
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
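For context on why the error shows `rownumber()`: Catalyst expressions derive their displayed name from the lowercased class name unless `prettyName` is overridden, so `RowNumber` prints as `rownumber`. The sketch below models that default-plus-override pattern in plain Scala; it is an illustration of the idea, not the actual Spark patch, and the class and method bodies here are assumptions simplified from Catalyst.

```scala
// Minimal model of the Catalyst naming pattern: the displayed name of an
// expression defaults to its lowercased class name, and a two-word SQL
// function must override `prettyName` to restore the underscore.
abstract class Expression {
  // Default display name: simple class name, lowercased
  // (e.g. class DenseRank -> "denserank").
  def prettyName: String = getClass.getSimpleName.toLowerCase
  // How the expression is rendered in messages, e.g. "denserank()".
  def sql: String = s"$prettyName()"
}

// Without an override the default kicks in and drops the underscore.
class DenseRank extends Expression

// With the override the correct two-word SQL name is shown.
class RowNumber extends Expression {
  override def prettyName: String = "row_number"
}

object Demo extends App {
  println((new DenseRank).sql)  // prints "denserank()"  -- the bug
  println((new RowNumber).sql)  // prints "row_number()" -- the fix
}
```

The same one-line override applied to `RowNumber`, `DenseRank`, `PercentRank`, and `CumeDist` is all the issue title asks for.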