On Tue, 2023-12-12 at 12:38 -0800, Philip Race wrote:
> Usage of hinting in UIs is on the way out.
> macOS stopped applying hints ages ago. Apple also canned LCD text.
> High DPI displays are obsoleting the raison d'etre of both of these.

High DPI displays might be the future, but they're not the present
yet, and the numbers don't suggest that the market is changing all
that much.

Looking at the numbers that do exist, standard 1080p displays still
account for a significant portion of the market:

https://gs.statcounter.com/screen-resolution-stats/desktop/worldwide

I would love to live in a world where everyone has a display with
high enough resolution that we can turn off AA entirely, but we don't
live in that world yet.

> A big problem with hinting is that it distorts both the size and the 
> shape, so UIs do not scale evenly

Can you give an example of UIs scaling unevenly, or of animations
looking jerky? All of the other (non-JavaFX) UI applications on my
system evidently use hinting, and I don't see anything noticeable
there. I'm also unclear on how hinting could affect scaling, given
that glyphs are rendered once and then packed into a texture atlas on
the GPU; a glyph rendered without hinting is going to be just as
static as a glyph rendered with hinting, except that the hinted glyph
is going to look "better" (assuming the hinting in the font file isn't
broken). Do you mean that the glyphs are continuously regenerated
during some kind of animated scaling operation?
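
For what it's worth, here's a minimal sketch of the "rasterize once,
reuse forever" model I have in mind (plain Java2D rather than the
actual Prism code, and the cache structure is my own assumption, not
anything JavaFX does verbatim). The point is that whether the
rasterizer applied hinting only changes the pixels produced in that
single rasterization step; nothing about it is per-frame:

  import java.awt.Color;
  import java.awt.Font;
  import java.awt.Graphics2D;
  import java.awt.font.FontRenderContext;
  import java.awt.font.GlyphVector;
  import java.awt.geom.Rectangle2D;
  import java.awt.image.BufferedImage;
  import java.util.HashMap;
  import java.util.Map;

  /*
   * Hypothetical glyph cache: each glyph is rasterized exactly once
   * and the resulting bitmap is reused for every subsequent draw.
   * Hinting (or the lack of it) only influences the pixels written
   * here, once.
   */
  final class GlyphCache {
    private final Map<String, BufferedImage> cache = new HashMap<>();
    private final Font font;
    private final FontRenderContext frc =
      new FontRenderContext(null, true, true);

    GlyphCache(final Font font) {
      this.font = font;
    }

    BufferedImage glyph(final char c) {
      final String key =
        this.font.getFontName() + ':' + this.font.getSize() + ':' + c;
      return this.cache.computeIfAbsent(key, k -> this.rasterize(c));
    }

    private BufferedImage rasterize(final char c) {
      final GlyphVector gv =
        this.font.createGlyphVector(this.frc, String.valueOf(c));
      final Rectangle2D bounds = gv.getVisualBounds();
      final int w = Math.max(1, (int) Math.ceil(bounds.getWidth()));
      final int h = Math.max(1, (int) Math.ceil(bounds.getHeight()));
      final BufferedImage image =
        new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
      final Graphics2D g = image.createGraphics();
      g.setColor(Color.BLACK);
      g.translate(-bounds.getX(), -bounds.getY());
      g.fill(gv.getGlyphOutline(0));
      g.dispose();
      return image;
    }
  }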

> Another is that a poorly hinted font is far worse than no hinting at
> all.

That should be decided on a font-by-font basis, and I think the
developer bundling the font is best equipped to make that decision.
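
To be clear about the kind of knob I mean, here's a purely
hypothetical sketch (none of these names exist in JavaFX today): the
developer who bundles a font states a hinting preference alongside
the font itself, and the toolkit honours it per font rather than
globally:

  /*
   * Hypothetical example only: a per-font hinting preference declared
   * by whoever bundles the font. The names are made up; the point is
   * that the decision is made per font, by the person shipping it.
   */
  enum HintingPreference { FULL, SLIGHT, NONE }

  record BundledFont(String resourcePath, HintingPreference hinting) { }

  final class ApplicationFonts {
    // Well-hinted UI font: keep hinting on.
    static final BundledFont UI_FONT =
      new BundledFont("/fonts/MainUI.ttf", HintingPreference.FULL);
    // Decorative font with broken hints: turn hinting off for it.
    static final BundledFont DECORATIVE_FONT =
      new BundledFont("/fonts/Decorative.ttf", HintingPreference.NONE);
  }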

-- 
Mark Raynsford | https://www.io7m.com
