On Sun, Nov 01, 2015 at 09:48:28PM +0000, Guenter Milde wrote:
> On 2015-11-01, Kornel Benko wrote:

> > Most of these tests were working some time ago. 
> 
> Many of them "by chance": not failing but with incorrect output (Missing
> characters, wrong characters) or "fragile" (working, but would fail by
> addition of one non-ASCII character, say).

Agreed, although I don't think "by chance" correctly describes the
missing-characters situation; it only fits the "fragile" documents.

> > We already have many inverted tests.
> 
> This also adds to the impression that this is an area where test
> failures are to be expected for many reasons. I.e. the signal-to-noise
> ratio is rather bad for XeTeX + TeX fonts, and we would be better off
> without these tests.

This might be. I admit I have very little knowledge of XeTeX + TeX
fonts.

> > I am strongly against such a policy. First one has to check whether
> > the reason really is a babel/polyglossia conflict.
> 
> There are many more possible reasons: mhchem, babel language files, font
> problems, ...
> 
> The combination XeTeX + TeX fonts is not supported by many packages,
> including standard LaTeX packages like "inputenc"! OTOH, it is so obscure
> that it is really not worth wasting much time to work around all the
> problems. Rather treat/document it as: avoid if possible, use at your own
> risk.

I am starting to be open to this idea. If it is true that performing a
simple, correct operation in a document has a significant chance of
breaking compilation with XeTeX and TeX fonts, then these tests could
indeed cause a lot of noise. On the other hand, our manuals were pretty
darn complicated and compiled just fine with TeX fonts. This might be
partly due to 'luck', but since they use so many packages, ERT, and
weird constructs, that would be a lot of luck. I guess the main issue is
the missing characters.
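
If I understand the failure mode correctly, a minimal example would be
something like this (a sketch, untested; the sample text is arbitrary),
compiled with plain xelatex and no fontspec:

    \documentclass{article}
    % inputenc is not an option here: it errors out under XeTeX,
    % so there is no supported way to remap the input encoding.
    \begin{document}
    Grüße  % with the legacy (OT1) Computer Modern fonts, XeTeX logs
           % "Missing character" warnings for ü and ß and drops the
           % glyphs, so the PDF silently loses text.
    \end{document}

The document still "compiles fine" in the sense that xelatex exits
successfully; the damage only shows up in the log and the output, which
would match the "not failing but with incorrect output" cases above.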

Scott
