Folks:

> Likewise the idea that HiDPI displays are always "2x" seems to me another
> inelegant hack.  Actually the DPI varies between devices, so high-resolution
> art should not always need to be exactly 2x the normal size.  It may be
> convenient, but it's not the kind of "solution" we can expect to last very 
> long.
> I wouldn't be surprised if Apple themselves change their tune later.

  At a minimum, we already have the situation where the iPhone 4/4s/5, new
  iPad, and MacBook Pro with Retina display are all "high pixel density"
  displays, but all three families have different pixel densities. For a
  display that *MUST* conform to a certain physical constraint (e.g., "all
  characters displayed must be 5mm in height"), this requires access to the
  true, device-specific pixel density.
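
  As an illustration, here is a minimal sketch (assuming QScreen reports the
  true physical pixel density, as Shawn proposes below;
  pixelsForPhysicalHeight() is a made-up helper, not existing Qt API):

#include <QGuiApplication>
#include <QScreen>
#include <QDebug>

// How many device pixels does a given physical height take on this screen?
int pixelsForPhysicalHeight(const QScreen *screen, qreal heightMm)
{
    // physicalDotsPerInch() is pixels per inch; 25.4 mm per inch.
    return qRound(heightMm / 25.4 * screen->physicalDotsPerInch());
}

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);
    // "All characters displayed must be 5mm in height":
    qDebug() << "5mm is" << pixelsForPhysicalHeight(app.primaryScreen(), 5.0)
             << "device pixels on this screen";
    return 0;
}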

                                                   Atlant

From: development-bounces+aschmidt=dekaresearch....@qt-project.org 
[mailto:development-bounces+aschmidt=dekaresearch....@qt-project.org] On Behalf 
Of Rutledge Shawn
Sent: Monday, October 01, 2012 5:58 AM
To: development@qt-project.org
Cc: Ziller Eike; Sorvig Morten
Subject: [Development] resolution independence (was Re: Retina display support)

On Sep 21, 2012, at 10:37 AM, ext Ziller Eike wrote:


>>> but that would be a huge waste of system resources and a performance drag
>>> when running on a non-retina system. Are there any better solutions?
>>
>> Aren't you seeing the window size in pixels as usual? With that available,
>> you would have a generic answer for your kind of question.
>
> Well, no. "Pixel" in the Qt world at the moment means something different
> from "pixel" in the physical world (when talking about Cocoa / Mac).
> The integer coordinates in Qt are actually mapped to what Cocoa calls
> "points", which refer to the "logical" coordinate space, not the "device"
> coordinate space.
> A HiDPI screen has the same number of "points" as a corresponding non-HiDPI
> screen, but it has a "scale" (of 2). Applications see the same number of
> points when they run on a HiDPI screen as they would on a non-HiDPI screen
> (--> everything has exactly the same physical dimensions when running on
> different screens).
> That means that Qt also reports the same dimensions. Rasterization for
> pixmaps is also done based on "points".

That distorts the definition of "pixel" rather more than one would expect.
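
To make that mapping concrete, here is a rough sketch (plain C++, not Qt API;
the scale of 2 is just what Cocoa reports for current Retina hardware):

#include <cstdio>

struct CocoaScreen {
    int widthPoints;   // logical size; what Qt applications currently see
    double scale;      // device pixels per point (2.0 on current HiDPI Macs)
};

int main()
{
    const CocoaScreen hidpi = {1440, 2.0};
    // Qt reports 1440 "pixels" on this screen, the same as on a non-HiDPI
    // 1440-point screen, even though the device really has 2880 pixels.
    std::printf("points: %d, device pixels: %d\n",
                hidpi.widthPoints, int(hidpi.widthPoints * hidpi.scale));
    return 0;
}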

Here's how it's supposed to work (how it already works on Linux and Windows):  
QScreen reports both the logical and physical DPI, and the documentation 
already states that logical DPI determines the size of a "point" for fonts.  
The physical DPI is calculated as the ratio of the configured resolution to the 
physical dimensions of the screen (as reported over the DDC connection from the 
monitor).  Logical DPI can be overridden in the operating system (in the 
display control panel, or on Linux, in xorg.conf or by giving a parameter when 
starting X).  Overriding the logical DPI is the normal way for people to "zoom" 
the screen, for example to get larger fonts if one's vision is impaired.  (Or 
else, people who don't know better might just change the resolution and let the 
scaling hardware zoom it up to fit, which will have a similar effect on logical 
DPI, but makes it blurry too.)  On pre-OSX Macs, 72 DPI was normal, and was 
relatively constant if you bought Apple displays.  But in more recent times 96 
DPI has become normal.  So I think a logical pixel should be defined as 
whatever the user or the OS sets it to be, by setting the logical DPI.  (Maybe 
Qt could have a configurable limit though, in case the OS doesn't provide a way 
to override the logical resolution.)
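
For example, on those platforms both values are already queryable through
QScreen (Qt 5 API; the actual numbers of course depend on what the OS and the
monitor's DDC/EDID data report):

#include <QGuiApplication>
#include <QScreen>
#include <QDebug>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);
    const QScreen *s = QGuiApplication::primaryScreen();
    // Logical DPI: whatever the user or OS configured; determines the size
    // of a font "point".
    qDebug() << "logical DPI:" << s->logicalDotsPerInch();
    // Physical DPI: configured resolution divided by the physical size the
    // monitor reports.
    qDebug() << "physical DPI:" << s->physicalDotsPerInch();
    qDebug() << "physical size (mm):" << s->physicalSize();
    return 0;
}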

QScreen on OSX currently has a hard-coded definition of DPI, 72 pixels per 
inch.  This is not accurate on any modern hardware, and I'm planning to change 
it to report actual resolution and logical resolution, just like the other 
platforms.  There are already HiDPI non-Apple displays, for example this from 
2009:  
http://techreport.com/news/16181/sony-intros-wide-expensive-vaio-p-netbook  
which has an 8" display with 1600x768 resolution.  If you run Linux or Windows 
on it, I expect that QScreen will tell you the actual resolution.  Qt is 
supposed to be cross-platform, so it doesn't make sense to do something 
completely different on OSX only.
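
A back-of-the-envelope check with the published numbers for that machine
(1600x768 on an 8" diagonal) shows how far from 72 or 96 DPI such hardware
already is:

#include <cmath>
#include <cstdio>

int main()
{
    const double w = 1600, h = 768, diagonalInches = 8.0;
    // DPI = diagonal resolution / diagonal size
    std::printf("%.0f DPI\n", std::sqrt(w * w + h * h) / diagonalInches);
    // prints about 222 DPI
    return 0;
}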

Likewise the idea that HiDPI displays are always "2x" seems to me another 
inelegant hack.  Actually the DPI varies between devices, so high-resolution 
art should not always need to be exactly 2x the normal size.  It may be 
convenient, but it's not the kind of "solution" we can expect to last very 
long.  I wouldn't be surprised if Apple themselves change their tune later.

I think for the sake of true resolution independence, we need to extend QML to 
have support for units.  E.g. you should be able to specify

Rectangle {
    width: 20mm
    height: 10mm
    Text {
        font.size: 5mm
        text: "Hello World"
    }
}

font.pixelSize and font.pointSize could even be deprecated then, because every 
supported unit would be OK for every possible dimension: pixels (which would 
probably be logical pixels), millimeters, points, inches, etc.  (Maybe we could 
also have "rpx" or some such to represent actual pixels rather than logical 
pixels.)  The fact that it's a change to the language makes it nontrivial, but 
at least it's the same as what CSS does, and QML was designed to be similar to 
CSS, after all.  Then we can claim that we have true resolution-independence.  
You could specify a rectangle as above, and measure with a ruler on the screen, 
and it should be exactly 2 x 1 cm on every device, as long as the device 
reports its own screen resolution accurately.  It would be the same if you 
print it.  When you are creating a UI, if you want exact sizes you could use 
real-world units, whereas if you want a UI which is scaled in proportion to the 
user's system-wide wishes, you would use logical pixels.
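
To sketch what the conversions behind such units might look like (hypothetical
code; the Unit enum and toDevicePixels() are illustrative, not existing Qt
API), the mm/inch cases would use physical DPI so that a ruler on the screen
agrees, while the point case follows the rule that logical DPI determines the
size of a "point":

#include <QScreen>

enum class Unit { Mm, Inch, Point };

qreal toDevicePixels(qreal value, Unit unit, const QScreen *screen)
{
    switch (unit) {
    case Unit::Mm:    return value / 25.4 * screen->physicalDotsPerInch();
    case Unit::Inch:  return value * screen->physicalDotsPerInch();
    case Unit::Point: return value / 72.0 * screen->logicalDotsPerInch();
    }
    return value;
}

So the "width: 20mm" rectangle above would come out as
toDevicePixels(20.0, Unit::Mm, screen) device pixels: exactly 2 cm, whenever
the screen reports its resolution accurately.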

But then it would also make sense to extend the Javascript implementation, so 
that it's possible to assign numbers with units.  As soon as such unit-value 
types exist, one begins to think it should be possible to do math with them 
too, and have transparent unit conversions whenever necessary.  It would be 
really cool, but it's all-new territory for Javascript (although it has been 
done before in some math-oriented languages).  As a stop-gap until the JS 
extension is done, maybe you could still assign a plain number to a unit-value 
quantity, in which case only the number is changed while the units remain the 
same.
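
The flavor of such unit-value arithmetic can at least be sketched in C++ today
(purely illustrative; a real JS extension would need engine support):

#include <cstdio>

// Store every length in one base unit (mm) so that math and conversions
// are transparent.
struct Length {
    double mm;
    Length operator+(Length other) const { return {mm + other.mm}; }
};

// User-defined literals give the "20mm" / "1in" syntax from the proposal.
constexpr Length operator""_mm(long double v) { return {double(v)}; }
constexpr Length operator""_in(long double v) { return {double(v) * 25.4}; }

int main()
{
    const Length margin = 20.0_mm + 1.0_in;  // units convert transparently
    std::printf("%.1f mm\n", margin.mm);     // prints 45.4 mm
    return 0;
}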



