Hi,

1. I think we need to be clear on terminology here.
Touch events are the events that the platform (Qt, in our case) sends when the user touches the screen, moves fingers on the screen, or lifts fingers from the screen. Gestures are sequences of touch events: several touch events may produce only one gesture.

2. Touch events are propagated to JavaScript and can be used there. There are no gesture events available to JavaScript. Because of this, if a web page needs some gesture (flick, swipe, pinch, etc.) it has to implement its own gesture recognition in JavaScript. That is what happens on several Google pages (Maps, Gmail).

3. Yes, you are right: you have to give the page the first chance to deal with touch events. You have to be careful here, though, as the page might implement a gesture that overrides the behavior of your app. For example, in your app a pinch zooms the whole page, while a pinch handled in JavaScript zooms just one div, or something like that.
But in general you can do something like this:

---------------------
Assumptions:
For each touch begin, the system will send one and only one touch end.
Mouse events are sent to the web engine only if the touch event has a single touch point (the touch was made with just one finger).


Pseudo Code:

Platform_touch_begin_event_handler()
{
    touchBeginConsumed = sendTouchBeginEventToWebEngine()
    if (!touchBeginConsumed) {
        sendMouseMoveEventToWebEngine(); // to trigger mouse over
        sendMousePressEventToWebEngine();
    }
}

Platform_touch_update_event_handler()
{
    touchUpdateConsumed = sendTouchUpdateToWebEngine();
    if (!touchUpdateConsumed) {
        if (needMouseMoveEvents()) // only for a one-finger touch
            sendMouseMoveEventToWebEngine(); // to select text in a text area, for example
        if (needToScroll()) // for example, if it is a one-finger move
            scrollPage();
    }
}

Platform_touch_end_handler()
{
    touchEndConsumed = sendTouchEndToWebEngine();
    // touchBeginConsumed must be remembered from the begin handler
    if (!touchBeginConsumed && !touchEndConsumed) {

        // optional?? It seems that the iPhone does it.
        if (!contentChangedAfterMouseMove())
            sendMouseReleaseEventToWebEngine();
    }
}



Br,
Misha


On 10/17/2011 07:55 AM, ext Felipe Crochik wrote:
Hi Ariya,

That is the problem: how can I "separate" the touch events implemented by web sites (e.g. pinching on the map on Google Maps, or swiping on a results page of a Google Images search to switch pages) from "gestures for user interaction" (e.g. changing the zoom factor, scrolling the page)?

It seems that I should first give the "page" a chance to deal with the gestures, and only if they are not "needed" have them interact with the whole view. Every "sample" I have seen starts by intercepting all user mouse/touch/gesture events and forwards only the mouse clicks to "webkit".

p.s. I really enjoy reading your blog and many times came across your work. Thanks!

Felipe

On Sun, Oct 16, 2011 at 11:15 PM, Ariya Hidayat <[email protected] <mailto:[email protected]>> wrote:

    On Sun, Oct 16, 2011 at 6:40 PM, Felipe Crochik
    <[email protected] <mailto:[email protected]>> wrote:
    > I can't seem to find a definitive answer where Qt Webkit "can"
    or can't
    > support gestures (pinch, swipe, ...)

    Do you mean the gestures for user interactions? For example, pinch can
    be used to zoom in and out, swipe to flick the view, etc. In that
    case, those gestures are application-specific gestures and should be
    handled at the application level, e.g. the browser which uses
    QtWebKit.

    If what you mean is touch events (http://www.w3.org/TR/touch-events/),
    then see http://trac.webkit.org/wiki/QtWebKitFeatures21.




    --
    Ariya Hidayat, http://ariya.ofilabs.com




_______________________________________________
webkit-qt mailing list
[email protected]
http://lists.webkit.org/mailman/listinfo.cgi/webkit-qt

