On Tue, 2009-03-10 at 17:40 -0500, Michael Hennebry wrote:
> Ideally, the X server has the correct DPIs
> and the application is written to use them.
> The application can discover the number of pixels in a 12pt font
> and enlarge or not depending on the answer and the purpose.
> Given that many applications don't do that,
> lying about the DPIs is a perfectly reasonable workaround.

Tim:
> i.e. 12 point text is the same size whether printed on 2 inches of
> paper, or 20 inches of paper.

Tom Horsley:
> Absolutely true, and absolutely the point. If you specify a 12 point
> font on a 46" 1920x1080 display, you will wind up drawing some
> random smudge of bits that is indeed able to fit on a line that is
> 12/72 of an inch high, but there aren't enough pixels in that line to
> render anything readable.

On Sun, 2009-03-08 at 11:44 -0600, Petrus de Calguarium wrote:
> 96x96 should be the default. I don't know why it isn't.

No. The DPI should be set to the values that actually represent the
hardware.

Font sizing, and the like, should be set by picking the font size you
want, not buggering up the DPI settings.

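For what it's worth, you can check what DPI the server actually derived
for the hardware with a stock X utility (the exact output wording below
is from memory, so take it as approximate):

    $ xdpyinfo | grep resolution
      resolution:    96x96 dots per inch

If that reports something like 75x75 on hardware you know is denser,
X never learned the monitor's real physical dimensions.
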
On Tuesday 10 March 2009 11:31:39 Tim wrote:
> Anyone who thinks that increasing the resolution *should* create smaller
> fonts, or GUI gadgets, has got it extremely wrong. And that includes
> all the programmers who stupidly do that.

Except that, if you want to do so, because you want more real estate
on the screen, shrinking things is exactly what you asked for.

On Tue, 10 Mar 2009 22:01:39 +1030
Tim wrote:
> > 96x96 should be the default. I don't know why it isn't.
>
> No. The DPI should be set to the values that actually represent the
> hardware.

Actually, that attitude is the one that is utter nonsense. If you
want to get slavish about actual representation, then you need to
know the distance of the viewer and specify font sizes by the visual
angle they subtend.

Tim:
> No. The DPI should be set to the values that actually represent the
> hardware.

Tom Horsley:
> Actually, that attitude is the one that is utter nonsense. If you
> want to get slavish about actual representation, then you need to
> know the distance of the viewer and specify font sizes by the visual
> angle they subtend.

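Putting rough numbers on that (standard small-angle arithmetic; the
viewing distances are only illustrative):

    angle subtended ~= glyph height / viewing distance   (radians)

A 12pt line is 12/72 = 1/6 inch tall. At a 24 inch desktop viewing
distance that is about (1/6)/24 ~= 0.007 rad, or 0.4 degrees. To look
the same size from 10 feet (120 inches) away it would have to be five
times taller, i.e. roughly a 60pt font.
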
On Wed, 11 Mar 2009 02:24:39 +1030
Tim wrote:
> i.e. 12 point text is the same size whether printed on 2 inches of
> paper, or 20 inches of paper.

Absolutely true, and absolutely the point. If you specify a 12 point
font on a 46" 1920x1080 display, you will wind up drawing some
random smudge of bits that is indeed able to fit on a line that is
12/72 of an inch high, but there aren't enough pixels in that line to
render anything readable.

On Tue, 2009-03-10 at 13:27 -0400, Tom Horsley wrote:
> Absolutely true, and absolutely the point. If you specify a 12 point
> font on a 46" 1920x1080 display, you will wind up drawing some
> random smudge of bits that is indeed able to fit on a line that is
> 12/72 of an inch high, but there aren't enough pixels in that line to
> render anything readable.

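The arithmetic behind that, taking the 46" figure as the diagonal: a
1920x1080 panel is sqrt(1920^2 + 1080^2) ~= 2203 pixels across the
diagonal, so 2203 / 46 ~= 48 DPI. A 12pt line is 12/72 of an inch,
which at 48 DPI comes to 12/72 * 48 = 8 pixels, nowhere near enough
to draw legible glyphs.
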
On Tue, 10 Mar 2009 13:35:51 -0430
Patrick O'Callaghan wrote:
> This is true, but it shouldn't be. It's true because the sizes of things
> in X are defined in terms of pixels, and it's wrong because 12pt type is
> 12pt, no matter what medium it's on. It's an absolute size, not a given
> number of pixels.

Ideally, the X server has the correct DPIs
and the application is written to use them.
The application can discover the number of pixels in a 12pt font
and enlarge or not depending on the answer and the purpose.
Given that many applications don't do that,
lying about the DPIs is a perfectly reasonable workaround.

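Concretely, the conversion is the standard one (96 and 75 are just the
values from this thread):

    pixels = points / 72 * DPI

so a 12pt font is 12/72 * 96 = 16 pixels tall at 96 DPI, but only 12.5
pixels at 75 DPI. An application that knows the real DPI can check that
pixel count and bump the point size when it comes out too small to read.
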
Hello guys, how do I configure the X server's DPI on Fedora 10?
In GNOME I have the DPI set to 96, but when I check Xorg.log I see that
it is using 75x75 DPI, which is the reason why my fonts are so blurry.

Thanks for the help,
D.

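(For anyone wanting to experiment first: a session-level way to force a
value, using stock X utilities, before touching any config files; 96 is
just the value being tested:

    $ xrandr --dpi 96
    $ echo 'Xft.dpi: 96' | xrdb -merge

Restart your applications to see the effect on font rendering.)
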
David Hláčik wrote:
> Thanks for the help,

96x96 should be the default. I don't know why it isn't. I have tried it on
an old 1992 CRT monitor and 96x96 worked splendidly, so I don't know what
kind of archaic hardware the present default is set for.

To change, edit /etc/kde/kdm/kdmrc and append ' -dpi 96' (no quotes, of
course) to the ServerArgsLocal line.

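The edited line would end up looking something like this (the arguments
already in your kdmrc may differ; only the trailing -dpi 96 is new):

    ServerArgsLocal=-nolisten tcp -dpi 96
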
On Sun, 08 Mar 2009 11:44:44 -0600
Petrus de Calguarium wrote:
> To change, edit /etc/kde/kdm/kdmrc and append ' -dpi 96' (no quotes, of
> course) to the ServerArgsLocal line.

Which works only if you are using KDM and not GDM.

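A display-manager-agnostic alternative is to give X the monitor's real
physical size in /etc/X11/xorg.conf, so the DPI it computes comes out
right. The numbers below are for a 1280x1024 screen forced to exactly
96 DPI (1280/96 in = 339 mm, 1024/96 in = 271 mm); substitute your own:

    Section "Monitor"
        Identifier  "Monitor0"
        DisplaySize 339 271    # width and height in millimetres
    EndSection
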
I've got a long rant on DPI on a website I'm working on,
with all my thoughts on the subject.