microcai wrote:

They don't care how big a window is in pixels, but in inches.

People may have different monitors with different DPI. Windows should
stay the same size regardless of the DPI.

Forcing DPI==96 on every monitor is a stupid idea, and we should avoid it
on the protocol side.

The reason this had to be done was the incredibly stupid idea that only *fonts* are measured in points, while every other graphic is measured in pixels. This strange split existed on both X and Windows, likely because the initial programs were terminal emulators where there were no graphics other than text. What it really means is that there are two different coordinate systems for all the graphics, and programmers just assumed these two systems always lined up exactly like they did on their own screen.
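
To make the mismatch concrete, here is a small sketch (my own illustration, not something from the thread) of the points-to-pixels conversion. A 12pt font cell and a 16px icon happen to be the same height at exactly 96 DPI, and drift apart everywhere else:

#include <stdio.h>

/* Convert a size in points (1/72 inch) to device pixels. */
static double points_to_pixels(double points, double dpi)
{
    return points * dpi / 72.0;
}

int main(void)
{
    double dpis[] = { 96.0, 144.0, 192.0 };
    for (int i = 0; i < 3; i++)
        printf("at %3.0f DPI: 12pt text is %4.1f px tall, the 16px icon is still 16 px\n",
               dpis[i], points_to_pixels(12.0, dpis[i]));
    return 0;
}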

After a lot of awful-looking output on screens with different DPI, both Windows and then X resorted to just forcing the DPI to 96, thus making the systems obey the programmers' assumptions. Bad DPI settings are still a bug on X, unexpectedly producing ridiculously large and tiny font sizes, and this is NEVER wanted.

The correct solution would have been to specify all coordinates in the same units, likely 1 unit in the CTM (current transformation matrix). For practical reasons on current-day screens this wants to be a pixel by default, but there is no reason a program can't read the DPI and set the CTM to draw actual-size graphics.
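
For example, with a drawing API that exposes a CTM (Cairo is just one illustration here, not something prescribed by this discussion), folding the queried DPI into the matrix once is enough to get physical-size output. How the DPI is obtained is left out, since that depends on the display system:

#include <cairo.h>

/* Sketch: make 1 user unit == 1 millimetre, so the drawing below has
 * the same physical size on any screen. */
static void draw_actual_size(cairo_t *cr, double dpi)
{
    cairo_save(cr);
    cairo_scale(cr, dpi / 25.4, dpi / 25.4);

    /* An 85.6 mm x 54.0 mm rectangle (credit-card size), regardless of DPI. */
    cairo_rectangle(cr, 10.0, 10.0, 85.6, 54.0);
    cairo_set_line_width(cr, 0.5);   /* 0.5 mm stroke */
    cairo_stroke(cr);

    cairo_restore(cr);
}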

I suggest DPI should also be window specific, so that the compositor can
*scale it*.

If I understand it right, a Wayland window has both a rectangle measured in screen coordinates, and a source image that can be a different size. The compositor is expected to transform the source image (scale it) to fit in the rectangle.
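
As a rough illustration of that model (this uses the separate viewporter extension rather than anything in the core protocol, so treat the names as an assumption), a client can hand the compositor a source rectangle in buffer coordinates and a different destination size, and the compositor does the scaling:

#include <wayland-client.h>
#include "viewporter-client-protocol.h"   /* generated by wayland-scanner */

/* Assumes `viewporter` was bound from the registry and `surface` already
 * has a 512x512 buffer attached. */
static void show_at_half_size(struct wp_viewporter *viewporter,
                              struct wl_surface *surface)
{
    struct wp_viewport *viewport =
        wp_viewporter_get_viewport(viewporter, surface);

    /* Source rectangle in buffer coordinates (wl_fixed_t). */
    wp_viewport_set_source(viewport,
                           wl_fixed_from_int(0), wl_fixed_from_int(0),
                           wl_fixed_from_int(512), wl_fixed_from_int(512));

    /* Destination size in surface coordinates: the compositor scales
     * the 512x512 source down to fit 256x256. */
    wp_viewport_set_destination(viewport, 256, 256);

    wl_surface_commit(surface);
}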