The decimal pattern for Arabic/Kuwait contains U+0660 ٠ ARABIC-INDIC DIGIT ZERO, apparently for the MinimumInteger part (using the Java DecimalFormat terminology), presumably to select the set of Arabic-Indic digits. However, this mechanism does not seem to be part of the Java patterns, so I suspect it was added by CLDR. But the best description I have been able to find is in UTR #35:
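
For reference, and to clarify what I mean: the only mechanism I know of in Java itself for choosing the digit set is DecimalFormatSymbols.setZeroDigit, not a non-ASCII zero embedded in the pattern string. A minimal sketch, assuming the standard J2SE classes:

    import java.text.DecimalFormat;
    import java.text.DecimalFormatSymbols;
    import java.util.Locale;

    public class ArabicDigitsDemo {
        public static void main(String[] args) {
            // The digit set is selected via the symbols, not the pattern:
            DecimalFormatSymbols symbols = new DecimalFormatSymbols(Locale.US);
            symbols.setZeroDigit('\u0660'); // ARABIC-INDIC DIGIT ZERO

            DecimalFormat format = new DecimalFormat("#,##0.###", symbols);
            // Prints the value using the digits U+0660..U+0669
            System.out.println(format.format(1234.5));
        }
    }

So a pattern that itself contains U+0660 looks like something outside the documented Java pattern syntax.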

The numbers element supplies information for formatting and parsing numbers and currencies. It has three sub-elements: <symbols>, <numbers>, and <currencies>. The data is based on the Java/ICU format. The currency IDs are from [ISO4217]. For more information, including the pattern structure, see [JavaNumbers].
(The last pointer goes to the J2SE 1.4.1 documentation; Sun says "Products listed on this page have completed the Sun End of Life process", by the way.)

So where can I find the documentation on the use of something other than U+0030 0 DIGIT ZERO in a CLDR number pattern?

Thanks,
Eric.