On 05/12/2022 23:05, Jean-Christophe Helary wrote:
> What's the best way in a DocBook centered process to ensure that the
> list of terms used in a software UI is (semi-automatically?) taken
> into account in the DocBook sources that describe that software?
I haven't had to do this, but since no-one else has responded yet...
> Problem at hand:
>
> - a Java application with ~2k UI strings (not all user-facing), in
>   a Bundle.properties file

Java also has an XML format for properties files.

> - a ~80K-word DocBook manual
>
> It is not trivial to keep track of the whole string set (searches,
> etc.)
>
> Also, the l10n process takes place on the DocBook sources, not on
> the HTML output, so tricks like <link linkend endterm/> don't work
> because translators don't see the target terms.
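For reference, a minimal Java XML properties file (the format read by java.util.Properties.loadFromXML()) looks like this; the keys and strings here are invented for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
  <!-- hypothetical keys; a real Bundle would define ~2k of these -->
  <entry key="menu.file.open">Open File...</entry>
  <entry key="dialog.save.title">Save Changes</entry>
</properties>
```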
Before translation, replace each <link/> with the replacement text from
the XML properties file wrapped in a well-known element that still
carries the identifier for the properties file entry.
After translation, if necessary, convert the well-known elements back
into <link/> and also do something to handle the strings that have been
translated differently in different places.
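A sketch of that pre-translation pass, assuming the properties-file key is what appears in the linkend attribute and that <phrase role="ui" remap="KEY"> is an acceptable "well-known element" (both are my assumptions, not an established convention; DocBook 5 namespace handling is omitted for brevity):

```python
# Pre-translation pass sketch: pull UI strings out of a Java XML
# properties document and substitute them for <link linkend="KEY"/>
# references, keeping the key in the remap attribute so the link can
# be reconstructed after translation.
import xml.etree.ElementTree as ET

def load_ui_strings(properties_xml: str) -> dict:
    """Parse a Java XML properties document into a {key: string} map."""
    root = ET.fromstring(properties_xml)
    return {entry.get("key"): entry.text for entry in root.iter("entry")}

def inline_ui_strings(docbook_xml: str, strings: dict) -> str:
    """Replace each <link linkend="KEY"/> whose KEY is a known UI string
    with <phrase role="ui" remap="KEY">string</phrase>."""
    root = ET.fromstring(docbook_xml)
    for parent in root.iter():
        for i, child in enumerate(list(parent)):
            key = child.get("linkend")
            if child.tag == "link" and key in strings:
                phrase = ET.Element("phrase", {"role": "ui", "remap": key})
                phrase.text = strings[key]
                phrase.tail = child.tail  # keep the text that followed the link
                parent[i] = phrase
    return ET.tostring(root, encoding="unicode")
```

Translators then see the actual UI term in context, and the remap attribute preserves enough information to restore the <link/> afterwards.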
Once you have the properties file for a second language, you could
insert the translated strings in place of <link/> when preparing for
translation. Alternatively, or as well, you could set up your
computer-aided translation tool to not translate the well-known elements
for the strings and insert the translated strings after everything else
is translated.
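A sketch of that last step, assuming the UI references were wrapped as <phrase role="ui" remap="KEY"> before translation (my invented convention) and that the second-language properties file has the same keys as the original bundle:

```python
# Post-translation sketch: overwrite each wrapped UI reference with the
# string from the target-language properties file, so the manual always
# matches the translated bundle. Keys absent from the bundle are
# reported rather than silently left in the source language.
import xml.etree.ElementTree as ET

def apply_translated_strings(docbook_xml: str, translated: dict) -> str:
    """Overwrite each <phrase role="ui" remap="KEY"> body with
    translated[KEY]; raise if any key is missing from the bundle."""
    root = ET.fromstring(docbook_xml)
    missing = []
    for phrase in root.iter("phrase"):
        key = phrase.get("remap")
        if phrase.get("role") == "ui" and key is not None:
            if key in translated:
                phrase.text = translated[key]
            else:
                missing.append(key)
    if missing:
        raise ValueError(f"UI strings missing from bundle: {missing}")
    return ET.tostring(root, encoding="unicode")
```

Running this as a final pass also catches the case where translators rendered the same UI term differently in different places, since every occurrence is forced back to the one string in the bundle.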
> I'm left with having to rewrite the strings explicitly, and that's a
> pain; it also adds the risk of mistakes in translations.
The more that you can automate, the better.
Regards,
Tony Graham.
--
Senior Architect
XML Division
Antenna House, Inc.
----
Skerries, Ireland
[email protected]