[ 
https://issues.apache.org/jira/browse/SPARK-46841?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Wenchen Fan resolved SPARK-46841.
---------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 46180
[https://github.com/apache/spark/pull/46180]

> Language support for collations
> -------------------------------
>
>                 Key: SPARK-46841
>                 URL: https://issues.apache.org/jira/browse/SPARK-46841
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core, SQL
>    Affects Versions: 4.0.0
>            Reporter: Aleksandar Tomic
>            Assignee: Nikola Mandic
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> Languages and localization for collations are supported by the ICU library. 
> Collation naming format is as follows:
> {code:java}
> <2-letter language code>[_<4-letter script>][_<3-letter country 
> code>][_specifier_specifier...]{code}
> The locale specifier consists of the first part of the collation name (language + 
> script + country). Locale specifiers need to be stable across ICU versions; 
> to keep existing ids and names invariant, we introduce a golden file with the 
> locale table, which should cause a CI failure on any silent changes.
> Currently supported optional specifiers:
>  * CS/CI - case sensitivity; default is case-sensitive; supported by 
> configuring ICU collation levels
>  * AS/AI - accent sensitivity; default is accent-sensitive; supported by 
> configuring ICU collation levels
>  * <unspecified>/LCASE/UCASE - case conversion performed prior to 
> comparisons; supported by internal implementation relying on ICU locale-aware 
> conversions
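The CS/CI and AS/AI specifiers above map onto collation strength levels. As an illustrative sketch only, using the JDK's `java.text.Collator` (which exposes strength levels analogous to ICU's; this is not Spark's actual CollationFactory implementation):

```java
import java.text.Collator;
import java.util.Locale;

// Sketch: how strength levels realize case/accent (in)sensitivity.
// java.text.Collator mirrors ICU's collation levels; purely illustrative.
public class StrengthDemo {
    public static void main(String[] args) {
        Collator c = Collator.getInstance(Locale.US);

        // TERTIARY (the default): case differences matter (CS behavior).
        c.setStrength(Collator.TERTIARY);
        System.out.println(c.compare("abc", "ABC") == 0); // false

        // SECONDARY: case ignored (CI), accents still significant (AS).
        c.setStrength(Collator.SECONDARY);
        System.out.println(c.compare("abc", "ABC") == 0);        // true
        System.out.println(c.compare("resume", "résumé") == 0);  // false

        // PRIMARY: both case and accents ignored (CI + AI).
        c.setStrength(Collator.PRIMARY);
        System.out.println(c.compare("resume", "résumé") == 0);  // true
    }
}
```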
> Users can write collation specifiers in any order, except for the locale, 
> which is mandatory and must come first. There is a one-to-one mapping between 
> collation ids and collation names defined in CollationFactory.
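Putting the naming format together, here is a minimal sketch of how a full collation name could be assembled from its parts. The helper name `buildCollationName` is hypothetical, not part of the actual CollationFactory API:

```java
// Hypothetical helper illustrating the documented naming format:
// <language>[_<script>][_<country>][_specifier_specifier...]
// (not the real CollationFactory API)
public class CollationNameSketch {
    static String buildCollationName(String language, String script,
                                     String country, String... specifiers) {
        StringBuilder name = new StringBuilder(language);
        if (script != null) name.append('_').append(script);
        if (country != null) name.append('_').append(country);
        for (String s : specifiers) name.append('_').append(s);
        return name.toString();
    }

    public static void main(String[] args) {
        // Serbian, Cyrillic script, Serbia, case- and accent-insensitive:
        System.out.println(buildCollationName("sr", "Cyrl", "SRB", "CI", "AI"));
        // -> sr_Cyrl_SRB_CI_AI
    }
}
```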



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
