[Touch-packages] [Bug 1525979] Re: Touchscreen interactions should take priority over mouse and disable it

2015-12-15 Thread Lorn Potter
I don't agree that it makes sense to disable one input method when
another is available. Make it a user-configurable option to use one or
the other, but do not prevent using both.

I have a touchscreen desktop with a second screen, and I want both
mouse and touchscreen available; I use both at the same time. What
bothers me is that the virtual keyboard pops up on the non-touch
screen when that screen has focus.

With a convergence phone there might be more of a case for doing this,
but I would still want and need to use the touchscreen at times, even
with a mouse attached.

-- 
You received this bug notification because you are a member of Ubuntu
Touch seeded packages, which is subscribed to qtbase-opensource-src in
Ubuntu.
https://bugs.launchpad.net/bugs/1525979

Title:
  Touchscreen interactions should take priority over mouse and disable
  it

Status in Canonical System Image:
  Confirmed
Status in Ubuntu UX:
  Triaged
Status in qtbase-opensource-src package in Ubuntu:
  New

Bug description:
  It is possible at the moment (r199, krillin, rc-proposed) to use
  both touch and mouse at the same time.

  Because of:
  - QtQuick's touch-to-mouse event synthesis;
  - the fact that most QML code in the wild relies on MouseArea to handle input (touch included);
  - the fact that there is no QML component that handles both touch and mouse events and gives the developer a good API for both;
  - the fact that making touch and mouse usable at the same time easily leads to unexpected and broken UX;

  I suggest we make it so that, by default, only one input device can be used at any given time (exceptional cases to be handled separately).
  Moreover, I think it would be a good idea to give touch events priority over mouse events, i.e. the mouse stops working when the user touches the screen, but not vice versa.

  I also think the final decision should take into account the conventions users are already accustomed to.
  I experimented with a touchscreen laptop (a Microsoft Surface) running Windows 10, and here is what I found:

  - in the default browser, interacting via touch stops and hides the
  mouse pointer and gives priority to (multi-)touch gestures. The
  pointer stays still (i.e. it does not follow the fingers). The only
  way I found to take control of the mouse again was to perform a
  single tap and then wait a short moment before moving the mouse.

  - in other apps without special touch handling, interacting via
  touch still disabled the mouse, but in this case the mouse pointer
  followed my finger (presumably an "if nothing consumes the touch
  events, fall back to mouse simulation" behaviour).

  I believe this bug is a showstopper for the convergent experience.

  It is currently possible to trigger flickering and broken UX in
  multiple places in Unity8: basically anything that relies on
  MouseMove events is broken and flickers.

  A few examples (a minimal repro sketch follows the list):
  - username vertical scrolling in the login manager (drag the username with your finger, then move the mouse);
  - window positioning (same as above);
  - indicators horizontal scrolling;
  - scrolling in any Flickable/ListView-based view inside applications and platform menus;
  - side scrolling in the Dash;
  - and so on.
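
  As a concrete illustration, here is a minimal hypothetical repro for
  the Flickable/ListView case (not code from Unity8): load it in
  qmlscene on a machine with both input devices, start dragging the
  list with a finger, then move the physical mouse while the finger is
  still down; the interleaved synthesized and real mouse moves fight
  over the same grab and the view jumps.

    import QtQuick 2.4

    // Any flickable view will do; a ListView is just the quickest to set up.
    ListView {
        width: 300; height: 500
        model: 50
        delegate: Text { text: "Row " + index; height: 40 }
    }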

  NOTE: after a discussion on IRC with Saviq, we agreed that it would
  be great if MouseArea could handle different input devices. I
  researched this before writing this post and found no way, with the
  current APIs, for MouseArea to do that. That means, imho, a long
  wait before such a feature lands in Qt itself. Hence I proposed the
  solution above as a workaround while we get the rest of the pieces
  working as we expect.



  
  === UX UPDATE ===
  This was discussed during today's (15 Dec) team UX review meeting.

  The outcome of the meeting was:
  - The UX team will start a research project to handle this matter in more detail.
  - We all agreed it makes sense to prevent multiple input devices from being active at the same time, i.e. mouse disables touch and touch disables mouse. This is, however, just a quick conclusion reached during the meeting; the details are to be worked out as part of the research project described in the previous point.

To manage notifications about this bug go to:
https://bugs.launchpad.net/canonical-devices-system-image/+bug/1525979/+subscriptions

-- 
Mailing list: https://launchpad.net/~touch-packages
Post to : touch-packages@lists.launchpad.net
Unsubscribe : https://launchpad.net/~touch-packages
More help   : https://help.launchpad.net/ListHelp


[Touch-packages] [Bug 1525979] Re: Touchscreen interactions should take priority over mouse and disable it

2015-12-15 Thread Magdalena Mirowicz
** Changed in: ubuntu-ux
   Status: New => Triaged

** Changed in: ubuntu-ux
   Importance: Undecided => Medium

** Changed in: ubuntu-ux
 Assignee: (unassigned) => Andrea Bernabei (faenil)



[Touch-packages] [Bug 1525979] Re: Touchscreen interactions should take priority over mouse and disable it

2015-12-15 Thread Andrea Bernabei
This was discussed during today's (15 Dec) team UX review meeting.

The outcome of the meeting was:
- The UX team will start a research project to handle this matter in more detail.
- We all agreed it makes sense to prevent multiple input devices from being active at the same time, i.e. mouse disables touch and touch disables mouse. This is, however, just a quick conclusion reached during the meeting; the details are to be worked out as part of the research project described in the previous point.

** Description changed: the UX UPDATE above was appended to the bug
description.


[Touch-packages] [Bug 1525979] Re: Touchscreen interactions should take priority over mouse and disable it

2015-12-15 Thread Andrea Bernabei
That would be quite a bad UX :)

Yes, I definitely meant *temporarily* disabling one input device while
the other is being used, much like palm detection in your touchpad
settings :)

** Summary changed:

- Touchscreen interactions should take priority over mouse and disable it
+ Touchscreen interactions should take priority over mouse and temporarily disable it

** Description changed:

+ That is, mouse and touch can of course both be plugged in, and both can be used, just not at the *very same* time. For instance, while the user is dragging a surface with the touchscreen, they should not be able to click, move, or otherwise interact with the mouse at the same time. Once the finger is released from the touchscreen, the mouse can be used again (and vice versa).
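
A rough QML sketch of this behaviour at the application level
(hypothetical: the real fix belongs in the toolkit/shell, and the
geometry here is made up): an on-top touch area consumes touches and
marks touch as active, while the underlying MouseArea is disabled for
as long as a finger is down.

    import QtQuick 2.4

    Rectangle {
        width: 400; height: 300

        // Simplified: a real implementation would count active touch
        // points instead of flipping on the first press/release.
        property bool touchActive: false

        MouseArea {
            anchors.fill: parent
            enabled: !touchActive   // real mouse is ignored while a finger is down
            onClicked: console.log("mouse click at", mouse.x, mouse.y)
        }

        // Stacked on top of the MouseArea: it consumes touch events (so no
        // synthesized mouse events reach the MouseArea) and, with
        // mouseEnabled false, lets real mouse events pass through to it.
        MultiPointTouchArea {
            anchors.fill: parent
            mouseEnabled: false
            onPressed: touchActive = true
            onReleased: touchActive = false
        }
    }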


[Touch-packages] [Bug 1525979] Re: Touchscreen interactions should take priority over mouse and disable it

2015-12-15 Thread Jean-Baptiste Lallement
** Changed in: canonical-devices-system-image
   Status: New => Confirmed

** Changed in: canonical-devices-system-image
   Importance: Undecided => Medium

** Changed in: canonical-devices-system-image
 Assignee: (unassigned) => Zoltan Balogh (bzoltan)

** Changed in: canonical-devices-system-image
Milestone: None => backlog



[Touch-packages] [Bug 1525979] Re: Touchscreen interactions should take priority over mouse and disable it

2015-12-15 Thread Andrea Bernabei
Of course you have to be able to use both devices!

The point here is *temporarily* disabling one, i.e. disabling the
mouse while you are dragging with touch, for instance.

You can still use touch, then mouse, then touch again.

Do you agree?



[Touch-packages] [Bug 1525979] Re: Touchscreen interactions should take priority over mouse and disable it

2015-12-15 Thread Lorn Potter
In that case, yes, I agree.

If it means I can only ever use one or the other, then no.



[Touch-packages] [Bug 1525979] Re: Touchscreen interactions should take priority over mouse and disable it

2015-12-14 Thread MichaƂ Sawicz
** Also affects: qtbase-opensource-src (Ubuntu)
   Importance: Undecided
   Status: New
