XDG_decoration protocol questions
I have added support for the XDG_decoration protocol to my MGRX Wayland videodriver (mgrx.fgrim.com), to get server-side window decorations. After doing the wayland-scanner magic to generate the .h include and the .c glue code:

Adding the include:

    #include "xdg-decoration-client-protocol.h"

Binding the global in the registry handler:

    } else if (strcmp(interface, zxdg_decoration_manager_v1_interface.name) == 0) {
        state->zxdg_decoration_manager_v1 = wl_registry_bind(
            wl_registry, name, &zxdg_decoration_manager_v1_interface, 1);
        //fprintf(stderr, "zxdg_decoration_manager_v1 detected\n");
    }

After getting the xdg_surface, getting the zxdg_toplevel_decoration_v1 object and adding the listener:

    /* if the compositor supports it (like the KDE one),
       ask for server-side decoration */
    if (_WGrState.zxdg_decoration_manager_v1) {
        _WGrState.zxdg_toplevel_decoration_v1 =
            zxdg_decoration_manager_v1_get_toplevel_decoration(
                _WGrState.zxdg_decoration_manager_v1, _WGrState.xdg_toplevel);
        zxdg_toplevel_decoration_v1_add_listener(
            _WGrState.zxdg_toplevel_decoration_v1,
            &zxdg_toplevel_decoration_v1_listener, &_WGrState);
    }

And declaring the listener:

    static void toplevel_decoration_v1_configure(void *data,
        struct zxdg_toplevel_decoration_v1 *zxdg_toplevel_decoration_v1,
        uint32_t mode)
    {
        //fprintf(stderr, "zxdg_decoration_manager_v1 mode %d\n", mode);
    }

    static const struct zxdg_toplevel_decoration_v1_listener
    zxdg_toplevel_decoration_v1_listener = {
        .configure = toplevel_decoration_v1_configure,
    };

After that, your window is (in KDE) magically server-side decorated. My questions are (thinking of servers other than KDE):

1. Can I assume the server will send the configure event even if I have not sent a set_mode request?

2. If the mode reported by the configure event is not 2 (server_side), do we need to send a set_mode(2) request and wait for the next configure event before attaching the first buffer to the wl_surface?

Thanks

Mariano Alvarez
Re: protocol questions
On Wednesday, 3 April 2013 12:43:35, Kristian Høgsberg wrote:
> > But the client may still want to popup a grabbing window (e.g. system-tray menu) in response to other event (e.g. dbus event) indirectly caused (handler in another process) by the user input.
>
> I can't think of anything that does this in any desktop environment I've ever seen.

The only case I currently have of something grabbing my X server is the GPG Agent pinentry program. But in that case, we've discussed in the past that we actually want to solve this problem differently: inform the compositor that this is a password / input-sensitive dialog and let the compositor handle it as it will. Anything else that pops up in my desktop won't get focus due to focus-stealing prevention anyway.

The one thing that needs to break out of that are out-of-process windows that still logically belong to the same "application". KDE, for example, does that for the KWallet password but also for those file transfer dialogs, showing progress. You can also easily go further, like having a separate settings application for configuring an application, but launched from inside the application.

Do we have a defined way of transferring focus and modality, enforcing stacking order?

--
Thiago Macieira - thiago.macieira (AT) intel.com
Software Architect - Intel Open Source Technology Center

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel
Re: protocol questions
On Wed, Apr 3, 2013 at 1:18 PM, Daniel Stone wrote:
> Hi,
>
> On 3 April 2013 18:01, Yichao Yu wrote:
> > Yes I am talking about menu not notification (sorry the name is status notifier[1] instead of status notification), which is the system tray protocol.
>
> Ah OK, I see. In this case though, there's still a user input event which triggers it, so I don't see how providing the serial has much effect here.

But that is from a different process.

> > Also having to send a serial of the event which triggers the action is really unfriendly to toolkits and all of them may just have to save the last serial globally (as what is done in weston clients now) and use it when sending out the requests, what is the point then of letting the client save it instead of just saving it in the compositor? And is having a focus really not enough?
>
> While it's mainly to make things explicit and eliminate all sorts of focus races, you also need it for multi-pointer support, to disambiguate which pointer you mean.

This should at most be a seat argument.

> Cheers,
> Daniel
Re: protocol questions
Hi,

On 3 April 2013 18:01, Yichao Yu wrote:
> Yes I am talking about menu not notification (sorry the name is status notifier[1] instead of status notification), which is the system tray protocol.

Ah OK, I see. In this case though, there's still a user input event which triggers it, so I don't see how providing the serial has much effect here.

> Also having to send a serial of the event which triggers the action is really unfriendly to toolkits and all of them may just have to save the last serial globally (as what is done in weston clients now) and use it when sending out the requests, what is the point then of letting the client save it instead of just saving it in the compositor? And is having a focus really not enough?

While it's mainly to make things explicit and eliminate all sorts of focus races, you also need it for multi-pointer support, to disambiguate which pointer you mean.

Cheers,
Daniel
Re: protocol questions
On Wed, Apr 3, 2013 at 12:56 PM, Daniel Stone wrote:
> Hi Yichao,
>
> On 3 April 2013 17:50, Yichao Yu wrote:
> > On Wed, Apr 3, 2013 at 12:43 PM, Kristian Høgsberg wrote:
> >> I can't think of anything that does this in any desktop environment I've ever seen. If a usecase for this comes up we can certainly add it, or any desktop environment can define its own extension to allow this kind of behavior. But think about it - spontaneously popping up a window that grabs pointer and keyboard input is not a nice thing to do. Typically a system-tray would pop up a tooltip or a notification bubble, and then maybe you can go click on it to popup a menu.
> >
> > I am talking about a real use case, which is the statusnotification protocol. Why is having a focus not enough for this?
>
> I think the point Kristian's trying to make is that the user interaction is different. When you launch a popup menu, it grabs all input, and the user cannot interact with any other application unless it's dismissed. Notification windows usually appear and slide away after a small time, but never exclusively grab input and prevent the user from interacting with other applications until they're explicitly dismissed. If that happened every time I received an email, I would've smashed my laptop to bits by now.
>
> Pop-up has a very specific meaning here: it's a menu launched by the active application in response to user input, which can be dismissed without penalty either when the user interacts with an element outside the menu, or, e.g., when the screensaver activates. Notifications are not the same thing, because dismissing them can involve some kind of loss. If you want a notification that does not capture all input, then we can work out a surface type for that. If you really do want a notification that does capture all input, then it's not a pop-up menu, it's a modal dialog or similar, as we already have for password prompts.

Yes I am talking about menu not notification (sorry the name is status notifier[1] instead of status notification), which is the system tray protocol.

Also having to send a serial of the event which triggers the action is really unfriendly to toolkits and all of them may just have to save the last serial globally (as what is done in weston clients now) and use it when sending out the requests, what is the point then of letting the client save it instead of just saving it in the compositor? And is having a focus really not enough?

[1] http://www.notmart.org/misc/statusnotifieritem/basic-design.html

> Cheers,
> Daniel
Re: protocol questions
Hi Yichao,

On 3 April 2013 17:50, Yichao Yu wrote:
> On Wed, Apr 3, 2013 at 12:43 PM, Kristian Høgsberg wrote:
>> I can't think of anything that does this in any desktop environment I've ever seen. If a usecase for this comes up we can certainly add it, or any desktop environment can define its own extension to allow this kind of behavior. But think about it - spontaneously popping up a window that grabs pointer and keyboard input is not a nice thing to do. Typically a system-tray would pop up a tooltip or a notification bubble, and then maybe you can go click on it to popup a menu.
>
> I am talking about a real use case, which is the statusnotification protocol. Why is having a focus not enough for this?

I think the point Kristian's trying to make is that the user interaction is different. When you launch a popup menu, it grabs all input, and the user cannot interact with any other application unless it's dismissed. Notification windows usually appear and slide away after a small time, but never exclusively grab input and prevent the user from interacting with other applications until they're explicitly dismissed. If that happened every time I received an email, I would've smashed my laptop to bits by now.

Pop-up has a very specific meaning here: it's a menu launched by the active application in response to user input, which can be dismissed without penalty either when the user interacts with an element outside the menu, or, e.g., when the screensaver activates. Notifications are not the same thing, because dismissing them can involve some kind of loss. If you want a notification that does not capture all input, then we can work out a surface type for that. If you really do want a notification that does capture all input, then it's not a pop-up menu, it's a modal dialog or similar, as we already have for password prompts.

Cheers,
Daniel
Re: protocol questions
On Wed, Apr 3, 2013 at 12:43 PM, Kristian Høgsberg wrote:
> On Wed, Apr 03, 2013 at 12:04:46PM -0400, Yichao Yu wrote:
> > On Wed, Apr 3, 2013 at 11:16 AM, Kristian Høgsberg wrote:
> > > On Wed, Apr 3, 2013 at 10:59 AM, Yichao Yu wrote:
> > > > On Wed, Apr 3, 2013 at 12:00 AM, Daniel Stone wrote:
> > > >> Hi,
> > > >>
> > > >> On 3 April 2013 03:09, Kristian Høgsberg wrote:
> > > >> > On Sat, Mar 30, 2013 at 01:31:34AM -0400, Matthias Clasen wrote:
> > > >> >> - It looks like I can't trigger a popup from a key or touch event, because set_popup requires a serial that corresponds to an implicit pointer grab. That is sad, I like the menu key...
> > > >> >
> > > >> > Yes, it looks like we'll need new protocol for that. It's also not possible to trigger keyboard move or resize of windows.
> > > >>
> > > >> Hm, do we really need new protocol for this, or, given that serials are display-global, can we just bump wl_shell_surface to v2 and note that v2 and above accept _either_ a key or button press for the serial argument to set_popup? I don't see any potential for confusion or getting things wrong, and it saves everyone a lot of really tedious typing.
> > > >
> > > > Why should there be a serial at all? What if the client got some input from elsewhere, e.g. popup a warning or sth like that because of a hardware error?
> > >
> > > That would just be a regular top-level window or a transient window. The popup window type is specifically for popup menus or dropdowns, which activate in response to user action and under X grabs mouse and keyboard. Under wayland the grab is internal to the server and tied to the popup window, but we still need an input event serial to make sure an application can only pop up a grabbing window in response to a user input.
> >
> > But the client may still want to popup a grabbing window (e.g. system-tray menu) in response to other event (e.g. dbus event) indirectly caused (handler in another process) by the user input.
>
> I can't think of anything that does this in any desktop environment I've ever seen. If a usecase for this comes up we can certainly add it, or any desktop environment can define its own extension to allow this kind of behavior. But think about it - spontaneously popping up a window that grabs pointer and keyboard input is not a nice thing to do. Typically a system-tray would pop up a tooltip or a notification bubble, and then maybe you can go click on it to popup a menu.

I am talking about a real use case, which is the statusnotification protocol. Why is having a focus not enough for this?

> Kristian
>
> > > >> >> - The wl_pointer interface seems to be a bit weak wrt to device properties. I would at least expect to learn about the number of buttons and right-handed vs left-handed, etc.
> > > >> >
> > > >> > Daniel covered this, though I do think that we should be able to determine the set of all buttons supported by all mice and communicate to the client if there's a case for that.
> > > >>
> > > >> Certainly evdev lets you see which buttons are supported by a pointer, as well as which keys are supported by a keyboard - at least to the extent that the hardware actually exposes this through HID, which is even less reliable than EDID. But it's definitely possible to do, and AFAICT hardware tends to err on the side of exposing too many capabilities, rather than too few (i.e. you're not going to get an event for a mouse button we previously claimed not to have).
> > > >>
> > > >> While we're here though, I'd love to clarify what a value of 1.0 in a wl_pointer::axis event means. Right now, anything with a wheel will send 1.0 per wheel click, whereas two-finger scrolling on touchpads will send 1.0 per pixel. This makes axis events totally unusable when you have a mouse and touchpad both: half your scrolls are going to be wrong. Can we settle on, and document, 1.0 as an arbitrary unit roughly equivalent to one 'click' of a wheel, and scale appropriately in the touchpad driver?
> > > >>
> > > >> And if we're going to stick with evdev BTN_* codes for button events, we should probably document that one too.
> > > >>
> > > >> Cheers,
> > > >> Daniel
Re: protocol questions
On Wed, Apr 03, 2013 at 12:04:46PM -0400, Yichao Yu wrote:
> On Wed, Apr 3, 2013 at 11:16 AM, Kristian Høgsberg wrote:
> > On Wed, Apr 3, 2013 at 10:59 AM, Yichao Yu wrote:
> > > On Wed, Apr 3, 2013 at 12:00 AM, Daniel Stone wrote:
> > >> Hi,
> > >>
> > >> On 3 April 2013 03:09, Kristian Høgsberg wrote:
> > >> > On Sat, Mar 30, 2013 at 01:31:34AM -0400, Matthias Clasen wrote:
> > >> >> - It looks like I can't trigger a popup from a key or touch event, because set_popup requires a serial that corresponds to an implicit pointer grab. That is sad, I like the menu key...
> > >> >
> > >> > Yes, it looks like we'll need new protocol for that. It's also not possible to trigger keyboard move or resize of windows.
> > >>
> > >> Hm, do we really need new protocol for this, or, given that serials are display-global, can we just bump wl_shell_surface to v2 and note that v2 and above accept _either_ a key or button press for the serial argument to set_popup? I don't see any potential for confusion or getting things wrong, and it saves everyone a lot of really tedious typing.
> > >
> > > Why should there be a serial at all? What if the client got some input from elsewhere, e.g. popup a warning or sth like that because of a hardware error?
> >
> > That would just be a regular top-level window or a transient window. The popup window type is specifically for popup menus or dropdowns, which activate in response to user action and under X grabs mouse and keyboard. Under wayland the grab is internal to the server and tied to the popup window, but we still need an input event serial to make sure an application can only pop up a grabbing window in response to a user input.
>
> But the client may still want to popup a grabbing window (e.g. system-tray menu) in response to other event (e.g. dbus event) indirectly caused (handler in another process) by the user input.

I can't think of anything that does this in any desktop environment I've ever seen. If a usecase for this comes up we can certainly add it, or any desktop environment can define its own extension to allow this kind of behavior. But think about it - spontaneously popping up a window that grabs pointer and keyboard input is not a nice thing to do. Typically a system-tray would pop up a tooltip or a notification bubble, and then maybe you can go click on it to popup a menu.

> Kristian
>
> > >> >> - The wl_pointer interface seems to be a bit weak wrt to device properties. I would at least expect to learn about the number of buttons and right-handed vs left-handed, etc.
> > >> >
> > >> > Daniel covered this, though I do think that we should be able to determine the set of all buttons supported by all mice and communicate to the client if there's a case for that.
> > >>
> > >> Certainly evdev lets you see which buttons are supported by a pointer, as well as which keys are supported by a keyboard - at least to the extent that the hardware actually exposes this through HID, which is even less reliable than EDID. But it's definitely possible to do, and AFAICT hardware tends to err on the side of exposing too many capabilities, rather than too few (i.e. you're not going to get an event for a mouse button we previously claimed not to have).
> > >>
> > >> While we're here though, I'd love to clarify what a value of 1.0 in a wl_pointer::axis event means. Right now, anything with a wheel will send 1.0 per wheel click, whereas two-finger scrolling on touchpads will send 1.0 per pixel. This makes axis events totally unusable when you have a mouse and touchpad both: half your scrolls are going to be wrong. Can we settle on, and document, 1.0 as an arbitrary unit roughly equivalent to one 'click' of a wheel, and scale appropriately in the touchpad driver?
> > >>
> > >> And if we're going to stick with evdev BTN_* codes for button events, we should probably document that one too.
> > >>
> > >> Cheers,
> > >> Daniel
Re: protocol questions
On Wed, Apr 3, 2013 at 11:16 AM, Kristian Høgsberg wrote:
> On Wed, Apr 3, 2013 at 10:59 AM, Yichao Yu wrote:
> > On Wed, Apr 3, 2013 at 12:00 AM, Daniel Stone wrote:
> >> Hi,
> >>
> >> On 3 April 2013 03:09, Kristian Høgsberg wrote:
> >> > On Sat, Mar 30, 2013 at 01:31:34AM -0400, Matthias Clasen wrote:
> >> >> - It looks like I can't trigger a popup from a key or touch event, because set_popup requires a serial that corresponds to an implicit pointer grab. That is sad, I like the menu key...
> >> >
> >> > Yes, it looks like we'll need new protocol for that. It's also not possible to trigger keyboard move or resize of windows.
> >>
> >> Hm, do we really need new protocol for this, or, given that serials are display-global, can we just bump wl_shell_surface to v2 and note that v2 and above accept _either_ a key or button press for the serial argument to set_popup? I don't see any potential for confusion or getting things wrong, and it saves everyone a lot of really tedious typing.
> >
> > Why should there be a serial at all? What if the client got some input from elsewhere, e.g. popup a warning or sth like that because of a hardware error?
>
> That would just be a regular top-level window or a transient window. The popup window type is specifically for popup menus or dropdowns, which activate in response to user action and under X grabs mouse and keyboard. Under wayland the grab is internal to the server and tied to the popup window, but we still need an input event serial to make sure an application can only pop up a grabbing window in response to a user input.

But the client may still want to popup a grabbing window (e.g. system-tray menu) in response to other event (e.g. dbus event) indirectly caused (handler in another process) by the user input.

> Kristian
>
> >> >> - The wl_pointer interface seems to be a bit weak wrt to device properties. I would at least expect to learn about the number of buttons and right-handed vs left-handed, etc.
> >> >
> >> > Daniel covered this, though I do think that we should be able to determine the set of all buttons supported by all mice and communicate to the client if there's a case for that.
> >>
> >> Certainly evdev lets you see which buttons are supported by a pointer, as well as which keys are supported by a keyboard - at least to the extent that the hardware actually exposes this through HID, which is even less reliable than EDID. But it's definitely possible to do, and AFAICT hardware tends to err on the side of exposing too many capabilities, rather than too few (i.e. you're not going to get an event for a mouse button we previously claimed not to have).
> >>
> >> While we're here though, I'd love to clarify what a value of 1.0 in a wl_pointer::axis event means. Right now, anything with a wheel will send 1.0 per wheel click, whereas two-finger scrolling on touchpads will send 1.0 per pixel. This makes axis events totally unusable when you have a mouse and touchpad both: half your scrolls are going to be wrong. Can we settle on, and document, 1.0 as an arbitrary unit roughly equivalent to one 'click' of a wheel, and scale appropriately in the touchpad driver?
> >>
> >> And if we're going to stick with evdev BTN_* codes for button events, we should probably document that one too.
> >>
> >> Cheers,
> >> Daniel
Re: protocol questions
On Wed, Apr 3, 2013 at 10:59 AM, Yichao Yu wrote:
> On Wed, Apr 3, 2013 at 12:00 AM, Daniel Stone wrote:
>> Hi,
>>
>> On 3 April 2013 03:09, Kristian Høgsberg wrote:
>> > On Sat, Mar 30, 2013 at 01:31:34AM -0400, Matthias Clasen wrote:
>> >> - It looks like I can't trigger a popup from a key or touch event, because set_popup requires a serial that corresponds to an implicit pointer grab. That is sad, I like the menu key...
>> >
>> > Yes, it looks like we'll need new protocol for that. It's also not possible to trigger keyboard move or resize of windows.
>>
>> Hm, do we really need new protocol for this, or, given that serials are display-global, can we just bump wl_shell_surface to v2 and note that v2 and above accept _either_ a key or button press for the serial argument to set_popup? I don't see any potential for confusion or getting things wrong, and it saves everyone a lot of really tedious typing.
>
> Why should there be a serial at all? What if the client got some input from elsewhere, e.g. popup a warning or sth like that because of a hardware error?

That would just be a regular top-level window or a transient window. The popup window type is specifically for popup menus or dropdowns, which activate in response to user action and under X grabs mouse and keyboard. Under wayland the grab is internal to the server and tied to the popup window, but we still need an input event serial to make sure an application can only pop up a grabbing window in response to a user input.

Kristian

>> >> - The wl_pointer interface seems to be a bit weak wrt to device properties. I would at least expect to learn about the number of buttons and right-handed vs left-handed, etc.
>> >
>> > Daniel covered this, though I do think that we should be able to determine the set of all buttons supported by all mice and communicate to the client if there's a case for that.
>>
>> Certainly evdev lets you see which buttons are supported by a pointer, as well as which keys are supported by a keyboard - at least to the extent that the hardware actually exposes this through HID, which is even less reliable than EDID. But it's definitely possible to do, and AFAICT hardware tends to err on the side of exposing too many capabilities, rather than too few (i.e. you're not going to get an event for a mouse button we previously claimed not to have).
>>
>> While we're here though, I'd love to clarify what a value of 1.0 in a wl_pointer::axis event means. Right now, anything with a wheel will send 1.0 per wheel click, whereas two-finger scrolling on touchpads will send 1.0 per pixel. This makes axis events totally unusable when you have a mouse and touchpad both: half your scrolls are going to be wrong. Can we settle on, and document, 1.0 as an arbitrary unit roughly equivalent to one 'click' of a wheel, and scale appropriately in the touchpad driver?
>>
>> And if we're going to stick with evdev BTN_* codes for button events, we should probably document that one too.
>>
>> Cheers,
>> Daniel
Re: protocol questions
On Wed, Apr 3, 2013 at 12:00 AM, Daniel Stone wrote:
> Hi,
>
> On 3 April 2013 03:09, Kristian Høgsberg wrote:
> > On Sat, Mar 30, 2013 at 01:31:34AM -0400, Matthias Clasen wrote:
> >> - It looks like I can't trigger a popup from a key or touch event, because set_popup requires a serial that corresponds to an implicit pointer grab. That is sad, I like the menu key...
> >
> > Yes, it looks like we'll need new protocol for that. It's also not possible to trigger keyboard move or resize of windows.
>
> Hm, do we really need new protocol for this, or, given that serials are display-global, can we just bump wl_shell_surface to v2 and note that v2 and above accept _either_ a key or button press for the serial argument to set_popup? I don't see any potential for confusion or getting things wrong, and it saves everyone a lot of really tedious typing.

Why should there be a serial at all? What if the client got some input from elsewhere, e.g. popup a warning or sth like that because of a hardware error??

> >> - The wl_pointer interface seems to be a bit weak wrt to device properties. I would at least expect to learn about the number of buttons and right-handed vs left-handed, etc.
> >
> > Daniel covered this, though I do think that we should be able to determine the set of all buttons supported by all mice and communicate to the client if there's a case for that.
>
> Certainly evdev lets you see which buttons are supported by a pointer, as well as which keys are supported by a keyboard - at least to the extent that the hardware actually exposes this through HID, which is even less reliable than EDID. But it's definitely possible to do, and AFAICT hardware tends to err on the side of exposing too many capabilities, rather than too few (i.e. you're not going to get an event for a mouse button we previously claimed not to have).
>
> While we're here though, I'd love to clarify what a value of 1.0 in a wl_pointer::axis event means. Right now, anything with a wheel will send 1.0 per wheel click, whereas two-finger scrolling on touchpads will send 1.0 per pixel. This makes axis events totally unusable for when you have a mouse and touchpad both: half your scrolls are going to be wrong. Can we settle on, and document, 1.0 as an arbitrary unit roughly equivalent to one 'click' of a wheel, and scale appropriately in the touchpad driver?
>
> And if we're going to stick with evdev BTN_* codes for button events, we should probably document that one too.
>
> Cheers,
> Daniel
Re: protocol questions
Hi,

On 3 April 2013 03:09, Kristian Høgsberg wrote:
> On Sat, Mar 30, 2013 at 01:31:34AM -0400, Matthias Clasen wrote:
>> - It looks like I can't trigger a popup from a key or touch event, because set_popup requires a serial that corresponds to an implicit pointer grab. That is sad, I like the menu key...
>
> Yes, it looks like we'll need new protocol for that. It's also not possible to trigger keyboard move or resize of windows.

Hm, do we really need new protocol for this, or, given that serials are display-global, can we just bump wl_shell_surface to v2 and note that v2 and above accept _either_ a key or button press for the serial argument to set_popup? I don't see any potential for confusion or getting things wrong, and it saves everyone a lot of really tedious typing.

>> - The wl_pointer interface seems to be a bit weak wrt to device properties. I would at least expect to learn about the number of buttons and right-handed vs left-handed, etc.
>
> Daniel covered this, though I do think that we should be able to determine the set of all buttons supported by all mice and communicate to the client if there's a case for that.

Certainly evdev lets you see which buttons are supported by a pointer, as well as which keys are supported by a keyboard - at least to the extent that the hardware actually exposes this through HID, which is even less reliable than EDID. But it's definitely possible to do, and AFAICT hardware tends to err on the side of exposing too many capabilities, rather than too few (i.e. you're not going to get an event for a mouse button we previously claimed not to have).

While we're here though, I'd love to clarify what a value of 1.0 in a wl_pointer::axis event means. Right now, anything with a wheel will send 1.0 per wheel click, whereas two-finger scrolling on touchpads will send 1.0 per pixel. This makes axis events totally unusable when you have a mouse and touchpad both: half your scrolls are going to be wrong. Can we settle on, and document, 1.0 as an arbitrary unit roughly equivalent to one 'click' of a wheel, and scale appropriately in the touchpad driver?

And if we're going to stick with evdev BTN_* codes for button events, we should probably document that one too.

Cheers,
Daniel
Re: protocol questions
On Sat, Mar 30, 2013 at 01:31:34AM -0400, Matthias Clasen wrote:
> Here are a few questions/observations I had while studying the protocol docs:

Most of these have been covered in responses, but let me sum up.

> - The use of serials in events seems a bit inconsistent. Most
> wl_pointer events have serials, but axis doesn't. wl_keyboard
> enter/leave events do. wl_data_offer.enter does, but the corresponding
> leave/motion events don't. Is there a rationale for this ?

First of all - yes, we need to document which serial is expected whenever a request takes a serial. Now, as for when to use a serial in an event: if the event signals a change in compositor state that makes the compositor handle requests differently. That sounds a little abstract or vague, but for example, a wl_pointer entering a window means that the client that owns the window can now set the cursor. A button press starts an implicit grab, and while that's active the client can ask the compositor to start moving or resizing its window. Conversely, the pointer moving around within a surface doesn't affect how the compositor makes decisions about incoming requests, and as such we don't have serials in those events.

> - Various input events have a time field. The spec doesn't really say
> anything about this. What is it good for, and what units are these -
> monotonic time ?

The units are milliseconds, but the timestamp base is undefined and they're only good for comparing against other input event timestamps. They are only available for input events that actually correspond to user activity (moving the mouse, clicking a button or key) as opposed to side-effect events such as enter/leave, which can happen without user interaction (a surface goes away spontaneously and the underlying surface receives a wl_pointer enter event). They're useful for determining double-click speed, typing speed (if that's ever useful) or, as Thiago suggests, for determining if a key press came before a mouse button click. Other uses are determining pointer speed or gesture recognition, where accurate timestamps for touch motion events are required.

> - It looks like I can't trigger a popup from a key or touch event,
> because set_popup requires a serial that corresponds to an implicit
> pointer grab. That is sad, I like the menu key...

Yes, it looks like we'll need new protocol for that. It's also not possible to trigger keyboard move or resize of windows.

> - Still on popups, I don't see a way for the client to dismiss the
> popup, or is that handled by just destroying the surface ?

Yes, just destroy the surface. The first popup starts the grab, adding more popups will keep the grab in place, but once all popups are gone, the grab is terminated. So it's possible to pop up window A, then window B, and then destroy A, and keep the grab. Once B is destroyed, the grab terminates. There is no way for a client to terminate the grab and keep the windows in place.

> - Buffer transformations - fun. How do these relate to each of the following ?
>   - resize edges
>   - transient offset
>   - buffer attach x/y
>   - input/opaque/damage regions
>   - surface x/y in motion events

All these are in the surface coordinate system and aren't affected by buffer transformations. The buffer transform lets a client indicate that the buffer contents have been rendered according to one of the 8 rotation/flip combinations. The intention is that a client can render the buffer contents according to the output transform of the output it's displaying on. This allows the compositor to pageflip to client buffers even on displays that are transformed in one of these ways.

> - What is a wl_touch.frame event ? Weston doesn't seem to generate those...

Daniel covered this.

> - Various strings in the protocol: title, class, output model/make.
> Are all of these required/known to be UTF-8 ? The class is documented
> as being a file path, which is bad news wrt to encodings...

Yes, all are UTF-8. What would you recommend for the class property?

> - The wl_pointer interface seems to be a bit weak wrt to device
> properties. I would at least expect to learn about the number of
> buttons and right-handed vs left-handed, etc.

Daniel covered this, though I do think that we should be able to determine the set of all buttons supported by all mice and communicate it to the client if there's a case for that.

Kristian
Re: protocol questions
On Sat, Mar 30, 2013 at 06:17:30PM -0700, Thiago Macieira wrote:
> On sábado, 30 de março de 2013 17.52.33, Nick Kisialiou wrote:
> > What about "long int" type to store the time stamps? Even in microseconds
> > it will take longer than 100 years to overflow 2^63.
>
> That requires changing the protocol.

Overflow also isn't a problem at all. The timestamps are carefully uint32_t so you can do

    if (click_time_2 - click_time_1 < double_click_time)
            /* double click */

and get the right (well-defined) result even in the face of either of the click timestamps wrapping around.

Kristian
Re: protocol questions
On Sat, Mar 30, 2013 at 8:44 PM, Daniel Stone wrote:
> Hi,
>
> On 30 March 2013 16:55, Thiago Macieira wrote:
>> On sábado, 30 de março de 2013 09.34.24, Matthias Clasen wrote:
>>> > Monotonic (ideally) time in an undefined domain, i.e. they're only
>>> > meaningful in relation to each other.
>>>
>>> What can you do with them ? For the use case that Giulio mentioned
>>> (double-click detection), I'd need to know at least if the difference
>>> between two times is seconds or milliseconds or microseconds...
>>
>> The protocol needs to specify the unit. It can't be dependent on the
>> device driver, that makes no sense. If it's in milliseconds, it will
>> overflow every 49.7 days. If it's microseconds, it will overflow every
>> 71.6 minutes.
>
> Yes, they are in milliseconds, I just explained it poorly.
>
>> It also needs to specify which timestamps are in the same time domain.
>> Can two timestamps be compared to each other only if:
>>
>> - they are in the same input device (same mouse, same keyboard), but not
>> across devices
>> - they are in the same seat, but not across seats
>> - they are in input event messages, but not other types of messages that
>> carry timestamps
>> - no restriction
>
> Personally, I think either #1 or #2. Definitely not #3 or #4. We want to
> be able to use the evdev timestamps rather than gettimeofday() when we
> receive it, so we can ensure that if someone clicks twice slowly, and the
> compositor takes a while to process the same event, it's not interpreted
> as a double-click.
>
>> For example, imagine the case of trying to ensure that a Ctrl key was
>> pressed before a mouse click happened, after the events were plucked out
>> of the event stream.
>>
>> Or is there another, recommended way of doing that, such as by using the
>> serials?
>
> Hmmm. I was going to say using the event order, but it all depends on
> which order the devices were read in. So I guess for this case we'd need
> to go with #2.
>
> Cheers,
> Daniel

As a side note, #2 also makes more sense for wl_pointer, which is an aggregate of more than one source.

--Jason Ekstrand
Re: protocol questions
Hi,

On 30 March 2013 16:55, Thiago Macieira wrote:
> On sábado, 30 de março de 2013 09.34.24, Matthias Clasen wrote:
>> > Monotonic (ideally) time in an undefined domain, i.e. they're only
>> > meaningful in relation to each other.
>>
>> What can you do with them ? For the use case that Giulio mentioned
>> (double-click detection), I'd need to know at least if the difference
>> between two times is seconds or milliseconds or microseconds...
>
> The protocol needs to specify the unit. It can't be dependent on the
> device driver, that makes no sense. If it's in milliseconds, it will
> overflow every 49.7 days. If it's microseconds, it will overflow every
> 71.6 minutes.

Yes, they are in milliseconds, I just explained it poorly.

> It also needs to specify which timestamps are in the same time domain.
> Can two timestamps be compared to each other only if:
>
> - they are in the same input device (same mouse, same keyboard), but not
> across devices
> - they are in the same seat, but not across seats
> - they are in input event messages, but not other types of messages that
> carry timestamps
> - no restriction

Personally, I think either #1 or #2. Definitely not #3 or #4. We want to be able to use the evdev timestamps rather than gettimeofday() when we receive it, so we can ensure that if someone clicks twice slowly, and the compositor takes a while to process the same event, it's not interpreted as a double-click.

> For example, imagine the case of trying to ensure that a Ctrl key was
> pressed before a mouse click happened, after the events were plucked out
> of the event stream.
>
> Or is there another, recommended way of doing that, such as by using the
> serials?

Hmmm. I was going to say using the event order, but it all depends on which order the devices were read in. So I guess for this case we'd need to go with #2.

Cheers,
Daniel
Re: protocol questions
On sábado, 30 de março de 2013 17.52.33, Nick Kisialiou wrote:
> What about "long int" type to store the time stamps? Even in microseconds
> it will take longer than 100 years to overflow 2^63.

That requires changing the protocol.

--
Thiago Macieira - thiago.macieira (AT) intel.com
Software Architect - Intel Open Source Technology Center
Re: protocol questions
What about "long int" type to store the time stamps? Even in microseconds it will take longer than 100 years to overflow 2^63.

NK

On Sat, Mar 30, 2013 at 9:55 AM, Thiago Macieira wrote:
> On sábado, 30 de março de 2013 09.34.24, Matthias Clasen wrote:
> > >> - Various input events have a time field. The spec doesn't really say
> > >> anything about this. What is it good for, and what units are these -
> > >> monotonic time ?
> > >
> > > Monotonic (ideally) time in an undefined domain, i.e. they're only
> > > meaningful in relation to each other.
> >
> > What can you do with them ? For the use case that Giulio mentioned
> > (double-click detection), I'd need to know at least if the difference
> > between two times is seconds or milliseconds or microseconds...
>
> The protocol needs to specify the unit. It can't be dependent on the
> device driver, that makes no sense. If it's in milliseconds, it will
> overflow every 49.7 days. If it's microseconds, it will overflow every
> 71.6 minutes.
>
> It also needs to specify which timestamps are in the same time domain.
> Can two timestamps be compared to each other only if:
>
> - they are in the same input device (same mouse, same keyboard), but not
> across devices
> - they are in the same seat, but not across seats
> - they are in input event messages, but not other types of messages that
> carry timestamps
> - no restriction
>
> For example, imagine the case of trying to ensure that a Ctrl key was
> pressed before a mouse click happened, after the events were plucked out
> of the event stream.
>
> Or is there another, recommended way of doing that, such as by using the
> serials?
>
> --
> Thiago Macieira - thiago.macieira (AT) intel.com
> Software Architect - Intel Open Source Technology Center
Re: protocol questions
On sábado, 30 de março de 2013 09.34.24, Matthias Clasen wrote:
> >> - Various input events have a time field. The spec doesn't really say
> >> anything about this. What is it good for, and what units are these -
> >> monotonic time ?
> >
> > Monotonic (ideally) time in an undefined domain, i.e. they're only
> > meaningful in relation to each other.
>
> What can you do with them ? For the use case that Giulio mentioned
> (double-click detection), I'd need to know at least if the difference
> between two times is seconds or milliseconds or microseconds...

The protocol needs to specify the unit. It can't be dependent on the device driver, that makes no sense. If it's in milliseconds, it will overflow every 49.7 days. If it's microseconds, it will overflow every 71.6 minutes.

It also needs to specify which timestamps are in the same time domain. Can two timestamps be compared to each other only if:

- they are in the same input device (same mouse, same keyboard), but not across devices
- they are in the same seat, but not across seats
- they are in input event messages, but not other types of messages that carry timestamps
- no restriction

For example, imagine the case of trying to ensure that a Ctrl key was pressed before a mouse click happened, after the events were plucked out of the event stream.

Or is there another, recommended way of doing that, such as by using the serials?

--
Thiago Macieira - thiago.macieira (AT) intel.com
Software Architect - Intel Open Source Technology Center
Re: protocol questions
Hi,

On 30 March 2013 13:34, Matthias Clasen wrote:
>>> - Various input events have a time field. The spec doesn't really say
>>> anything about this. What is it good for, and what units are these -
>>> monotonic time ?
>>
>> Monotonic (ideally) time in an undefined domain, i.e. they're only
>> meaningful in relation to each other.
>
> What can you do with them ? For the use case that Giulio mentioned
> (double-click detection), I'd need to know at least if the difference
> between two times is seconds or milliseconds or microseconds...

Oh sorry, milliseconds. Just with an undefined base, i.e. they don't necessarily correlate to gettimeofday() or CLOCK_MONOTONIC.

>>> - Still on popups, I don't see a way for the client to dismiss the
>>> popup, or is that handled by just destroying the surface ?
>>
>> Indeed, just destroy the surface or attach a NULL buffer.
>
> Good to know. I don't think the spec mentions at all that 'attach NULL
> buffer' == unmap.

Mapping rules are specific to the surface type, but yes, indeed I can't think of any surface roles where that isn't the case.

>>> - Buffer transformations - fun. How do these relate to each of the
>>> following ?
>>>   - resize edges
>>>   - transient offset
>>>   - buffer attach x/y
>>>   - input/opaque/damage regions
>>>   - surface x/y in motion events
>>
>> All the latter occur on surfaces rather than buffers, so are unaffected.
>> Buffer transforms are meant to support situations like where your screen
>> is rotated 90°, and your client can also render rotated in order to
>> avoid that extra blit. So it doesn't affect the event pipeline at all,
>> only the display pipeline.
>
> That sounds right for resize edges and motion events, certainly. For some
> of the others, at least the wording of the spec is not always very clear
> on this point. E.g. for buffer attach x/y, the wl_surface.attach docs say:
>
>     The x and y arguments specify the location of the new pending
>     buffer's upper left corner, relative to the current buffer's
>     upper left corner.
>
> See how it talks about the current buffer's upper left corner. Should
> that say 'the surface's upper left corner', then ?

Yeah, except the wording needs to be a little more subtle, to clarify that the position change happens a) in surface co-ordinates, and b) when the buffer is attached. But this is the one I'm least sure about, in all honesty. :)

Cheers,
Daniel
Re: protocol questions
On Sat, Mar 30, 2013 at 7:56 AM, Daniel Stone wrote:
>> - Various input events have a time field. The spec doesn't really say
>> anything about this. What is it good for, and what units are these -
>> monotonic time ?
>
> Monotonic (ideally) time in an undefined domain, i.e. they're only
> meaningful in relation to each other.

What can you do with them ? For the use case that Giulio mentioned (double-click detection), I'd need to know at least if the difference between two times is seconds or milliseconds or microseconds...

>> - Still on popups, I don't see a way for the client to dismiss the
>> popup, or is that handled by just destroying the surface ?
>
> Indeed, just destroy the surface or attach a NULL buffer.

Good to know. I don't think the spec mentions at all that 'attach NULL buffer' == unmap.

>> - Buffer transformations - fun. How do these relate to each of the
>> following ?
>>   - resize edges
>>   - transient offset
>>   - buffer attach x/y
>>   - input/opaque/damage regions
>>   - surface x/y in motion events
>
> All the latter occur on surfaces rather than buffers, so are unaffected.
> Buffer transforms are meant to support situations like where your screen
> is rotated 90°, and your client can also render rotated in order to
> avoid that extra blit. So it doesn't affect the event pipeline at all,
> only the display pipeline.

That sounds right for resize edges and motion events, certainly. For some of the others, at least the wording of the spec is not always very clear on this point. E.g. for buffer attach x/y, the wl_surface.attach docs say:

    The x and y arguments specify the location of the new pending
    buffer's upper left corner, relative to the current buffer's
    upper left corner.

See how it talks about the current buffer's upper left corner. Should that say 'the surface's upper left corner', then ?
Re: protocol questions
Hi,

On 30 March 2013 05:31, Matthias Clasen wrote:
> Here are a few questions/observations I had while studying the protocol
> docs:
>
> - The use of serials in events seems a bit inconsistent. Most
> wl_pointer events have serials, but axis doesn't. wl_keyboard
> enter/leave events do. wl_data_offer.enter does, but the corresponding
> leave/motion events don't. Is there a rationale for this ?

Yes: serials are used for events which can be used to trigger other events, e.g. setting the pointer, launching a popup, starting a drag, etc. This is not something you tend to do from scroll or data events.

> - Various input events have a time field. The spec doesn't really say
> anything about this. What is it good for, and what units are these -
> monotonic time ?

Monotonic (ideally) time in an undefined domain, i.e. they're only meaningful in relation to each other.

> - It looks like I can't trigger a popup from a key or touch event,
> because set_popup requires a serial that corresponds to an implicit
> pointer grab. That is sad, I like the menu key...

Yeah, that'd be great to fix!

> - Still on popups, I don't see a way for the client to dismiss the
> popup, or is that handled by just destroying the surface ?

Indeed, just destroy the surface or attach a NULL buffer.

> - Buffer transformations - fun. How do these relate to each of the
> following ?
>   - resize edges
>   - transient offset
>   - buffer attach x/y
>   - input/opaque/damage regions
>   - surface x/y in motion events

All the latter occur on surfaces rather than buffers, so are unaffected. Buffer transforms are meant to support situations like where your screen is rotated 90°, and your client can also render rotated in order to avoid that extra blit. So it doesn't affect the event pipeline at all, only the display pipeline.

> - What is a wl_touch.frame event ? Weston doesn't seem to generate those...

It's meant to indicate a natural boundary between touch events, à la a full EV_SYN. So you'd send touch events for every finger down, followed by frame, at which point you could perform gesture processing.

> - The wl_pointer interface seems to be a bit weak wrt to device
> properties. I would at least expect to learn about the number of
> buttons and right-handed vs left-handed, etc.

wl_pointer is an aggregation of mice, not a single mouse, so we can't necessarily sensibly expose the number of buttons. For right- vs. left-handed, I'd expect the compositor to do the swap and clients never have to worry about it. If you want to expose that configuration, that should occur through private protocol.

Cheers,
Daniel
Re: protocol questions
2013/3/30 Matthias Clasen:
> Here are a few questions/observations I had while studying the protocol
> docs:
>
> - The use of serials in events seems a bit inconsistent. Most
> wl_pointer events have serials, but axis doesn't. wl_keyboard
> enter/leave events do. wl_data_offer.enter does, but the corresponding
> leave/motion events don't. Is there a rationale for this ?
>
> - Various input events have a time field. The spec doesn't really say
> anything about this. What is it good for, and what units are these -
> monotonic time ?

The time can be used to calculate whether two clicks are a double click, for instance. The timestamps are sent by evdev, and I think they are monotonic, but I'm not sure.

> - It looks like I can't trigger a popup from a key or touch event,
> because set_popup requires a serial that corresponds to an implicit
> pointer grab. That is sad, I like the menu key...

Yes, this is a known bug which needs to be addressed.

> - Still on popups, I don't see a way for the client to dismiss the
> popup, or is that handled by just destroying the surface ?

I think the only other way apart from destroying it is unmapping it, that is, attaching a null buffer to the surface.

> - Buffer transformations - fun. How do these relate to each of the
> following ?
>   - resize edges
>   - transient offset
>   - buffer attach x/y
>   - input/opaque/damage regions
>   - surface x/y in motion events
>
> - What is a wl_touch.frame event ? Weston doesn't seem to generate those...
>
> - Various strings in the protocol: title, class, output model/make.
> Are all of these required/known to be UTF-8 ? The class is documented
> as being a file path, which is bad news wrt to encodings...
>
> - The wl_pointer interface seems to be a bit weak wrt to device
> properties. I would at least expect to learn about the number of
> buttons and right-handed vs left-handed, etc.
>
> Thanks for any insights you can share about these questions.
>
> Matthias
protocol questions
Here are a few questions/observations I had while studying the protocol docs:

- The use of serials in events seems a bit inconsistent. Most wl_pointer events have serials, but axis doesn't. wl_keyboard enter/leave events do. wl_data_offer.enter does, but the corresponding leave/motion events don't. Is there a rationale for this ?

- Various input events have a time field. The spec doesn't really say anything about this. What is it good for, and what units are these - monotonic time ?

- It looks like I can't trigger a popup from a key or touch event, because set_popup requires a serial that corresponds to an implicit pointer grab. That is sad, I like the menu key...

- Still on popups, I don't see a way for the client to dismiss the popup, or is that handled by just destroying the surface ?

- Buffer transformations - fun. How do these relate to each of the following ?
  - resize edges
  - transient offset
  - buffer attach x/y
  - input/opaque/damage regions
  - surface x/y in motion events

- What is a wl_touch.frame event ? Weston doesn't seem to generate those...

- Various strings in the protocol: title, class, output model/make. Are all of these required/known to be UTF-8 ? The class is documented as being a file path, which is bad news wrt to encodings...

- The wl_pointer interface seems to be a bit weak wrt to device properties. I would at least expect to learn about the number of buttons and right-handed vs left-handed, etc.

Thanks for any insights you can share about these questions.

Matthias
Re: Wire protocol questions
On Mon, Feb 18, 2013 at 10:35:55PM +0100, Peter Hultqvist wrote:
> I've been reading
> http://wayland.freedesktop.org/docs/html/sect-Protocol-Wire-Format.html
> and done some testing with the wayland socket.
>
> From the docs I got the package sender object id, packet size and opcode.
> I also understand the argument formats.
>
> From looking at the initial packets sent by the server and correlating
> with the arguments in wayland.xml
>
> ObjectID: 1 (interface "wl_registry"?)
> Opcode: 1 (event "global"?)
> Name: uint
> Interface: string
> Version: uint
>
> I'm guessing that these Names are the objectID to be used for further
> requests. Am I mixing up object ID and names here or are they the same?

Global names are different from object IDs. They're both uint32_t, which is confusing, of course. The global name is an identifier you pass to wl_registry_bind, to bind the global object to an object ID. Once you've bound the global to an object ID, you can start sending requests to it and receive events from it using that object ID. Object IDs are private to each client, and each client has its own object ID namespace.

> What I cannot figure out is how to map a request to an opcode integer.
> global happened to be the first event in the wl_registry interface, is
> the order in the xml-file dictating the opcodes?

The order in the XML file defines the opcodes.

> Also if there is any better source for this information such as a
> specific source code file I'm eager to read it.

Most of these conventions are in scanner.c, which is what reads the XML and dumps the C code and headers.

Kristian
Wire protocol questions
I've been reading http://wayland.freedesktop.org/docs/html/sect-Protocol-Wire-Format.html and done some testing with the wayland socket.

From the docs I got the package sender object id, packet size and opcode. I also understand the argument formats.

From looking at the initial packets sent by the server and correlating with the arguments in wayland.xml:

    ObjectID: 1 (interface "wl_registry"?)
    Opcode: 1 (event "global"?)
    Name: uint
    Interface: string
    Version: uint

I'm guessing that these Names are the objectID to be used for further requests. Am I mixing up object ID and names here or are they the same?

What I cannot figure out is how to map a request to an opcode integer. global happened to be the first event in the wl_registry interface, is the order in the xml-file dictating the opcodes?

Also if there is any better source for this information, such as a specific source code file, I'm eager to read it.