Hi Keerthivasan,

Answers below.

On Tue, Dec 26, 2023 at 15:32 Keerthivasan Raghavan <
mail2akas...@gmail.com> wrote:

> Hi All,
>
> I am a newbie to the desktop/embedded linux graphics and widget toolkit.
>
> The following is a list of questions about how OpenJFX is
> architected/designed/implemented:
>
> * How does openjfx manage the lifecycle of
> windows/surfaces/graphics-context(EGL/OpenGL) to draw into?
> What is the (design and implementation)/architecture of the window system
> abstraction used by OpenJFX?
>   Any links to code snippets inside OpenJFX showing the
> creation/management of an X11-Window/Wayland-Surface would help.
> References:
> * GTK uses GDK(https://docs.gtk.org/gtk4/drawing-model.html) for managing
> windowing abstraction.
> * GLFW, a cross platform (window + graphics context + input) management:
> https://www.glfw.org/
> * Microsoft Windows win32 API:
> https://learn.microsoft.com/en-us/windows/win32/learnwin32/creating-a-window,
>
>
> https://learn.microsoft.com/en-us/windows/win32/api/winuser/nf-winuser-createwindowexa
>
>
Each platform has its own implementation. The abstraction layer is called
"glass". In the case of Linux, GTK is used. Pure Wayland (without XWayland)
is not supported yet - and, from my experience, GTK will probably not be
used for it.
Look for glass_window.cpp for the GTK backend.
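If it helps to see the general shape, here is a toy sketch of a glass-style platform abstraction: a common window-lifecycle interface with one stub backend per platform. All names here (PlatformWindow, StubGtkWindow) are hypothetical; the real glass window for the GTK backend lives in native code (glass_window.cpp), not in Java like this.

```java
// Toy sketch (hypothetical names, not the actual glass API): each platform
// backend supplies its own native window implementation behind a common
// lifecycle interface.
interface PlatformWindow {
    void show();            // map the native window/surface on screen
    void hide();            // unmap it, keeping resources alive
    void close();           // destroy the native resources
    boolean isShowing();
}

// A stand-in for a GTK-backed implementation (the real one is native code).
class StubGtkWindow implements PlatformWindow {
    private boolean showing;
    private boolean closed;

    @Override public void show()  { if (!closed) showing = true; }
    @Override public void hide()  { showing = false; }
    @Override public void close() { showing = false; closed = true; }
    @Override public boolean isShowing() { return showing; }
}

public class WindowLifecycleSketch {
    public static void main(String[] args) {
        PlatformWindow w = new StubGtkWindow();
        w.show();
        System.out.println("showing=" + w.isShowing()); // showing=true
        w.close();
        System.out.println("showing=" + w.isShowing()); // showing=false
    }
}
```

On the public API side, a javafx.stage.Stage is roughly the user-facing counterpart of such a glass window.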


> * How are frames created using drawing operations?
> What is the abstraction used to express the content of the frame?
> Any code/design links please.
> References:
> * GTK uses GSK(https://docs.gtk.org/gsk4/) for building the scene graph
> that can be rendered as a frame.
> * Windows reference:
> https://learn.microsoft.com/en-us/windows/win32/learnwin32/your-first-direct2d-program
>
>
The rendering abstraction layer is called "prism". I'm not sure exactly how
the scene graph is translated into rendering commands.
On Linux, GLX is used to glue to OpenGL in a sublayer of prism called
"es2". If Wayland support is implemented, it will have to use EGL instead.
Windows uses Direct3D.
Mac will use Metal (I think it's under development).
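To illustrate the idea of one scene feeding multiple backends, here is a toy sketch: a retained list of nodes is walked once per frame, and each "pipeline" turns the same nodes into its own drawing commands, the way prism can sit on es2 (OpenGL) on Linux or d3d (Direct3D) on Windows. All the types and the command strings below are hypothetical, not the real prism API.

```java
// Toy sketch (hypothetical names, not the actual prism API): a retained
// scene walked by a backend-specific pipeline.
import java.util.ArrayList;
import java.util.List;

interface Pipeline {
    String name();
    // Emit one backend-specific command for a filled rectangle.
    String fillRect(int x, int y, int w, int h);
}

class Es2Pipeline implements Pipeline {
    public String name() { return "es2"; }
    public String fillRect(int x, int y, int w, int h) {
        return "glDrawArrays rect " + x + "," + y + " " + w + "x" + h;
    }
}

class RectNode {
    final int x, y, w, h;
    RectNode(int x, int y, int w, int h) {
        this.x = x; this.y = y; this.w = w; this.h = h;
    }
}

public class PrismSketch {
    // Walk the retained node list, producing backend-specific commands.
    static List<String> renderFrame(Pipeline p, List<RectNode> scene) {
        List<String> commands = new ArrayList<>();
        for (RectNode n : scene) commands.add(p.fillRect(n.x, n.y, n.w, n.h));
        return commands;
    }

    public static void main(String[] args) {
        List<RectNode> scene = List.of(new RectNode(0, 0, 10, 10));
        System.out.println(renderFrame(new Es2Pipeline(), scene));
    }
}
```

The point is only the split: the scene is backend-agnostic, and swapping the Pipeline implementation swaps the graphics API without touching the scene.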



>
> * How is event management done in OpenJFX ?
> Any code/design links please.
> Reference:
> * GTK uses a main event loop: https://docs.gtk.org/glib/main-loop.html .
>

Desktop events originate from glass and are platform dependent. Those
events are translated into the JavaFX event system.
In the case of Linux, the GTK event loop is used.
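The translate-then-dispatch flow can be sketched like this: a raw platform event (what the native event loop delivers) is converted into a toolkit-level event and handed to registered handlers. All names and the event code below are made up for illustration; none of this is the real glass or JavaFX event API.

```java
// Toy sketch (hypothetical names): a glass-like layer translating a raw
// native event into a toolkit event, then dispatching it to handlers.
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

class NativeEvent {             // what the platform event loop delivers
    final int code, x, y;
    NativeEvent(int code, int x, int y) { this.code = code; this.x = x; this.y = y; }
}

class MouseClickEvent {         // what the toolkit exposes to applications
    final int x, y;
    MouseClickEvent(int x, int y) { this.x = x; this.y = y; }
}

public class EventTranslationSketch {
    static final int NATIVE_BUTTON_PRESS = 4; // made-up event code

    final List<Consumer<MouseClickEvent>> handlers = new ArrayList<>();

    void addHandler(Consumer<MouseClickEvent> h) { handlers.add(h); }

    // The glass-like step: translate the native event, then dispatch.
    void onNativeEvent(NativeEvent e) {
        if (e.code == NATIVE_BUTTON_PRESS) {
            MouseClickEvent clicked = new MouseClickEvent(e.x, e.y);
            for (Consumer<MouseClickEvent> h : handlers) h.accept(clicked);
        }
    }

    public static void main(String[] args) {
        EventTranslationSketch toolkit = new EventTranslationSketch();
        toolkit.addHandler(e -> System.out.println("click at " + e.x + "," + e.y));
        toolkit.onNativeEvent(new NativeEvent(NATIVE_BUTTON_PRESS, 5, 7)); // prints "click at 5,7"
    }
}
```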


>
> * How are widgets drawn ? How are events dispatched to widgets and how do
> widgets react to events ?
> How is the application widget UI state stored and what is the
> corresponding memory management for storing the UI/widget state ?
> Does the UI state get modeled as a scene graph ?
> Any code/design links please.
> Reference:
> *
> https://learn.microsoft.com/en-us/windows/win32/learnwin32/retained-mode-versus-immediate-mode
>
>
Widgets are drawn using the scene graph. I think a Node is the basic
element (in the public API).
In the case of Linux, JavaFX draws directly into the window/surface (which
is passed in by glass). That is the hardware-accelerated rendering case. If
software rendering is used, it generates a bitmap buffer that is passed to
glass. In the case of Linux, Cairo is used to present that buffer.
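The software path can be sketched as: something paints into an in-memory pixel buffer, and the finished buffer is what gets handed down to the windowing layer to be blitted onto the native window/surface. The code below is a toy with hypothetical names only; no real prism or glass types are involved.

```java
// Toy sketch (hypothetical names): software rendering into an ARGB int
// buffer; the completed buffer would then be passed to the windowing
// layer for presentation.
public class SoftwareBufferSketch {
    static final int WIDTH = 8, HEIGHT = 8;

    // Fill a rectangle in the buffer, clipping to the surface bounds.
    static void fillRect(int[] buf, int x, int y, int w, int h, int argb) {
        for (int row = Math.max(0, y); row < Math.min(HEIGHT, y + h); row++)
            for (int col = Math.max(0, x); col < Math.min(WIDTH, x + w); col++)
                buf[row * WIDTH + col] = argb;
    }

    public static void main(String[] args) {
        int[] frame = new int[WIDTH * HEIGHT];   // the bitmap buffer
        fillRect(frame, 2, 2, 3, 3, 0xFFFF0000); // a "node" paints opaque red
        // At this point the buffer would be handed down to be blitted
        // onto the native window/surface.
        System.out.println("pixel(2,2)=" + Integer.toHexString(frame[2 * WIDTH + 2])); // pixel(2,2)=ffff0000
    }
}
```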



> Please feel free to reply with code links, design docs, wikis or articles
> of the web.
>
> Thank you,
> Keerthivasan Raghavan
>
