On Tue, 16 Dec 2025 18:24:32 GMT, Andy Goryachev <[email protected]> wrote:
>> John Hendrikx has updated the pull request incrementally with one additional
>> commit since the last revision:
>>
>>   Add more debug output
>
> Note: I am using the latest standalone monkey tester
> https://github.com/andy-goryachev-oracle/MonkeyTest but a similar issue is
> present in the one that's currently in the main repo.

@andy-goryachev-oracle Thanks for taking the time to test this. I think I may have a fix, included in this PR (I left the print statements in for now).

It was again a Mac-specific problem that I couldn't reproduce on Windows. Luckily, I could borrow the Mac again (and my stuff was still on it).

The difference between Mac and Windows here was the layout flag status on the Scene root after the Window was shown. On Mac it was `DIRTY_BRANCH`, while on Windows it was `NEEDS_LAYOUT`. This means that on Windows, the scene's size would be fully recalculated and adjusted again. On Mac, only some child controls would be adjusted, as `DIRTY_BRANCH` indicates "this node is fine, but some child does need layout".

How this difference arises is probably somewhere in the peer code or native code. On Windows, some control (or perhaps the Scene) probably receives a signal that it should relayout itself when the Window is shown; this bubbles up to the root, and the root gets `NEEDS_LAYOUT`. On Mac this works slightly differently, and that doesn't happen. I don't think we need to look there to resolve the problem, as I think the fault still lies in how sizing is handled when the window is shown.

The solution is luckily much simpler. I was already suspicious of the fact that we ask the Scene to "size itself" in **all** situations, even if both the width and height of the Window were set explicitly. This happens with the somewhat poorly named `SceneHelper.preferredSize(getScene())` call (it doesn't return the preferred size, it **sets** it).
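To illustrate the `NEEDS_LAYOUT` vs `DIRTY_BRANCH` distinction, here is a minimal standalone model of the flag propagation (this is *not* the JavaFX source; the names mirror the package-private `LayoutFlags` enum in `javafx.scene.Parent`, but the classes and fields below are simplified stand-ins):

```java
import java.util.ArrayList;
import java.util.List;

enum LayoutFlags { CLEAN, NEEDS_LAYOUT, DIRTY_BRANCH }

class LayoutNode {
    LayoutNode parent;
    final List<LayoutNode> children = new ArrayList<>();
    LayoutFlags flag = LayoutFlags.CLEAN;
    int layoutPasses = 0;   // how often this node recomputed its OWN layout

    LayoutNode add(LayoutNode child) {
        child.parent = this;
        children.add(child);
        return this;
    }

    // Mark this node as needing layout; ancestors only become DIRTY_BRANCH
    // ("I'm fine, but some descendant needs layout").
    void requestLayout() {
        flag = LayoutFlags.NEEDS_LAYOUT;
        for (LayoutNode p = parent; p != null && p.flag == LayoutFlags.CLEAN; p = p.parent) {
            p.flag = LayoutFlags.DIRTY_BRANCH;
        }
    }

    void layout() {
        switch (flag) {
            case CLEAN -> { /* nothing to do */ }
            case NEEDS_LAYOUT -> {
                layoutPasses++;                    // full recompute of this node
                children.forEach(LayoutNode::layout);
                flag = LayoutFlags.CLEAN;
            }
            case DIRTY_BRANCH -> {
                children.forEach(LayoutNode::layout);  // only descend; own size untouched
                flag = LayoutFlags.CLEAN;
            }
        }
    }
}
```

With this model, a dirty leaf leaves the root at `DIRTY_BRANCH`, and a layout pass over the root never recomputes the root itself, which is exactly why a `DIRTY_BRANCH` Scene root on Mac skips the full recalculation that Windows performs.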
This hard-sets the Scene to its preferred size, and since on Mac the Scene root is `DIRTY_BRANCH`, it would not be recalculated when the Window is shown. This could cause both gaps in the UI (on your 1.0-scale screen) and UI pieces being partially shown (which I noticed on your 2.0-scale screen).

So, the fix is to only call `SceneHelper.preferredSize(getScene())` when one or both dimensions are not explicit, since in that case we'll be adjusting the size on the scene for one or both dimensions via the `adjustSize(true)` call. If both dimensions were explicit, then there is no need to size the scene to its preferred size at all (this is the MonkeyTester case when it has stored dimensions).

Let me know if this resolves the problem for you as well, Andy.

-------------

PR Comment: https://git.openjdk.org/jfx/pull/2007#issuecomment-3678894225
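P.S. For clarity, the guard condition the fix boils down to can be sketched as follows (illustrative class and method names, not the literal patch; the real code sits in JavaFX's internal window-sizing path around `SceneHelper.preferredSize` and `adjustSize`):

```java
class SceneSizingFix {
    // Forcing the scene to its preferred size is only needed when at least
    // one window dimension was NOT set explicitly, because only then does
    // adjustSize(true) need a computed scene size to derive a window dimension.
    static boolean shouldSizeSceneToPreferred(boolean widthExplicit, boolean heightExplicit) {
        return !widthExplicit || !heightExplicit;
    }
}
```

In the case where both dimensions are explicit (e.g. restored, stored dimensions), the method returns `false` and the scene is left alone.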
