[ https://issues.apache.org/jira/browse/SPARK-40109?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17580501#comment-17580501 ]
Apache Spark commented on SPARK-40109:
--------------------------------------

User 'gengliangwang' has created a pull request for this issue:
https://github.com/apache/spark/pull/37541

> New SQL function: get()
> -----------------------
>
>                 Key: SPARK-40109
>                 URL: https://issues.apache.org/jira/browse/SPARK-40109
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: Gengliang Wang
>            Assignee: Gengliang Wang
>            Priority: Major
>
> Currently, accessing an array element with an invalid index under ANSI SQL mode produces an error like:
> {quote}[INVALID_ARRAY_INDEX] The index -1 is out of bounds. The array has 3 elements. Use `try_element_at` and increase the array index by 1 (the starting array index is 1 for `try_element_at`) to tolerate accessing element at invalid index and return NULL instead. If necessary set "spark.sql.ansi.enabled" to "false" to bypass this error.
> {quote}
> The suggested workaround is complicated. I propose introducing a new function get() that always returns null on an invalid array index, following https://docs.snowflake.com/en/sql-reference/functions/get.html.
> Since Spark's map access already returns null for missing keys, let's not support the map type in get() for now.
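
A small, hedged sketch of the proposed semantics follows. It assumes, based on the Snowflake GET reference linked above, that get() takes an array and a 0-based index and returns NULL for any negative or out-of-range index; the exact signature is whatever the linked pull request ends up implementing. The try_element_at lines illustrate the existing 1-based workaround mentioned in the error message.

{code:sql}
-- Existing workaround under ANSI mode: try_element_at is 1-based and
-- returns NULL instead of raising INVALID_ARRAY_INDEX.
SELECT try_element_at(array(1, 2, 3), 1);   -- 1
SELECT try_element_at(array(1, 2, 3), 4);   -- NULL

-- Proposed get(): 0-based indexing, NULL on any invalid index
-- (sketch of the intended behavior, not the merged implementation).
SELECT get(array(1, 2, 3), 0);              -- 1
SELECT get(array(1, 2, 3), 3);              -- NULL
SELECT get(array(1, 2, 3), -1);             -- NULL
{code}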