[ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-----------------------------
    Description: 
`split_part()` is a function commonly supported by other systems such as 
Postgres. The Spark equivalent is 
`element_at(split(arg, delim), part)`.



The following describes the new function in more detail:


{code:java}
split_part(str, delimiter, partNum)

str: string type
delimiter: string type
partNum: integer type

1. This function splits `str` by `delimiter` and returns the requested part of
   the split (1-based).
2. If any input parameter is NULL, returns NULL.
3. If the index is out of range of the split parts, returns NULL.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the end of
   the string.
6. If `delimiter` is empty, `str` is considered not split, so there is just
   one split part.
{code}


Examples:
```
      > SELECT _FUNC_('11.12.13', '.', 3);
       13
      > SELECT _FUNC_(NULL, '.', 3);
      NULL
      > SELECT _FUNC_('11.12.13', '', 1);
      '11.12.13'
```
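The rules and examples above can be sketched as a plain-Python reference model (an illustration of the proposed semantics, not Spark's implementation; SQL NULL is modeled as `None`, and the delimiter is treated literally here, whereas Spark's `split` interprets it as a regex):

```python
def split_part(s, delimiter, part_num):
    """Reference sketch of the proposed split_part semantics (rules 1-6)."""
    # Rule 2: if any input parameter is NULL, return NULL.
    if s is None or delimiter is None or part_num is None:
        return None
    # Rule 4: a partNum of 0 is an error.
    if part_num == 0:
        raise ValueError("partNum must not be 0")
    # Rule 6: an empty delimiter leaves str unsplit, so there is one part.
    parts = [s] if delimiter == "" else s.split(delimiter)
    # Rules 1 and 5: 1-based indexing; a negative partNum counts from the end.
    idx = part_num - 1 if part_num > 0 else len(parts) + part_num
    # Rule 3: an out-of-range index returns NULL.
    if idx < 0 or idx >= len(parts):
        return None
    return parts[idx]
```

Running this model against the examples above reproduces the expected results, e.g. `split_part('11.12.13', '.', 3)` yields `'13'` and `split_part('11.12.13', '', 1)` yields `'11.12.13'`.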


  was:
`split_part()` is a function commonly supported by other systems such as 
Postgres. The Spark equivalent is 
`element_at(split(arg, delim), part)`.



The following describes the new function in more detail:

`split_part(str, delimiter, partNum)`

This function splits `str` by `delimiter` and returns the requested part of the 
split (1-based). If any input parameter is NULL, returns NULL.

`str` and `delimiter` are of `string` type. `partNum` is of `integer` type.

Examples:
```
      > SELECT _FUNC_('11.12.13', '.', 3);
       13
      > SELECT _FUNC_(NULL, '.', 3);
      NULL
```



> Support SQL split_part function
> -------------------------------
>
>                 Key: SPARK-38063
>                 URL: https://issues.apache.org/jira/browse/SPARK-38063
>             Project: Spark
>          Issue Type: Task
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Rui Wang
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
