https://bugzilla.wikimedia.org/show_bug.cgi?id=37338

--- Comment #19 from fanchy <li3939...@gmail.com> ---
(In reply to comment #18)
I have started a discussion (in Chinese) on the Wikidata project chat
(http://www.wikidata.org/wiki/Wikidata:%E4%BA%92%E5%8A%A9%E5%AE%A2%E6%A0%88#.E5.BC.BA.E7.83.88.E5.BB.BA.E8.AE.AE.E5.BD.BB.E5.BA.95.E6.B8.85.E9.99.A4zh-cn.2Czh-tw.2Czh-hk.2Czh-sg.2Czh-my.2Czh-mo.E8.AF.AD.E8.A8.80.E4.BB.A3.E7.A0.81)
about the 9 zh language codes, which I think have caused many problems on
Commons. Basically, I hope there will be a small number of language codes for
Chinese in multilingual projects like Commons and Wikidata. Even many Chinese
speakers don't know what the 9 zh language codes mean. When writing scripts,
foreigners often ignore some of the variants or use them in the wrong way,
which causes bugs.

> I'm planning to do it later for item pages. But for content pages... It's
> still
> needed.

For the item pages, I think we only need an automatic converter for human
writing, not an automatic converter for human reading. For example, when I
enter a zh-hans label, it can automatically convert it to zh-hant and store
both zh-hans and zh-hant. This would be very simple and could even be done
with JavaScript or a bot, although it would cause a little redundancy.
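The write-time idea above could be sketched roughly as follows. Note this is only an illustration under assumptions: the tiny Simplified-to-Traditional mapping table and the `storeLabels`/`hansToHant` names are made up for the example; a real bot would use MediaWiki's full zh conversion tables rather than a hand-made map.

```javascript
// Hypothetical mini mapping table (Simplified -> Traditional).
// A real converter would load MediaWiki's complete conversion data.
const HANS_TO_HANT = {
  "汉": "漢",
  "语": "語",
  "简": "簡",
  "体": "體",
};

// Convert a zh-hans label character by character; characters not in
// the table (shared by both scripts) pass through unchanged.
function hansToHant(label) {
  return Array.from(label)
    .map((ch) => HANS_TO_HANT[ch] ?? ch)
    .join("");
}

// When the user enters a zh-hans label, store both variants, as the
// comment proposes -- accepting the small redundancy.
function storeLabels(item, hansLabel) {
  item.labels["zh-hans"] = hansLabel;
  item.labels["zh-hant"] = hansToHant(hansLabel);
  return item;
}
```

For example, `storeLabels({labels: {}}, "汉语")` would record "汉语" under zh-hans and "漢語" under zh-hant in one step, which is why the redundancy seems cheap compared to a read-time converter.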

The content pages really do need that kind of converter for human reading. Or
it could also be resolved with a converter for human writing, as above. I
think the redundancy can be tolerated if the result is much simpler.

-- 
You are receiving this mail because:
You are the assignee for the bug.
You are on the CC list for the bug.
_______________________________________________
Wikibugs-l mailing list
Wikibugs-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l