[ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-----------------------------
    Description: 
`split_part()` is a function commonly supported by other systems such as 
Postgres. The Spark equivalent is 
`element_at(split(arg, delim), part)`
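
For reference, a minimal sketch of that existing workaround in Spark SQL. Note that `split` takes a Java regular expression, so a literal dot delimiter is written here as a character class; the literal values are just illustrative and this assumes default (non-ANSI) settings:

{code:java}
-- equivalent of split_part('11.12.13', '.', 3) using existing functions
SELECT element_at(split('11.12.13', '[.]'), 3);
-- 13

-- element_at is 1-based and also accepts negative indices counted from the end
SELECT element_at(split('11.12.13', '[.]'), -1);
-- 13
{code}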



h5. Function Specification

h6. Syntax

{code:java}
split_part(str, delimiter, partNum)
{code}

h6. Arguments
{code:java}
str: string type
delimiter: string type
partNum: integer type
{code}

h6. Note
{code:java}
1. This function splits `str` by `delimiter` and returns the requested part of 
the split (1-based). 
2. If any input parameter is NULL, returns NULL.
3. If `partNum` is out of the range of split parts, returns an empty string.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the end of the 
string.
6. If `delimiter` is empty, `str` is considered not split, so there is just one 
split part (the whole string). 
{code}

h6. Examples
{code:java}
> SELECT _FUNC_('11.12.13', '.', 3);
13
> SELECT _FUNC_(NULL, '.', 3);
NULL
> SELECT _FUNC_('11.12.13', '', 1);
'11.12.13'
{code}
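
Note that the `element_at(split(...))` workaround does not match this specification in every corner case. A hedged illustration, assuming default Spark 3.x settings (ANSI mode off):

{code:java}
-- out-of-range index: the workaround yields NULL,
-- while the proposed split_part returns an empty string
SELECT element_at(split('11.12.13', '[.]'), 5);
-- NULL

-- empty delimiter: split(str, '') breaks the string into individual characters,
-- while the proposed split_part would return the whole string for partNum = 1
SELECT element_at(split('11.12.13', ''), 1);
-- 1
{code}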






> Support SQL split_part function
> -------------------------------
>
>                 Key: SPARK-38063
>                 URL: https://issues.apache.org/jira/browse/SPARK-38063
>             Project: Spark
>          Issue Type: Task
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Rui Wang
>            Priority: Major


