
Gestures Programming

Qt includes a framework for gesture programming that has the ability to form gestures from a series of events, independently of the input methods used. A gesture could be a particular movement of a mouse, a touch screen action, or a series of events from some other source. The nature of the input, the interpretation of the gesture, and the action taken are the choice of the developer.

Overview

QGesture is the central class in Qt's gesture framework, providing the API used by classes that represent specific gestures, such as QPanGesture, QPinchGesture, and QSwipeGesture. These standard classes are ready to use, and each exposes functions and properties that give gesture-specific information about the user's input. This is described in the Using Standard Gestures With Widgets section.

QGesture is also designed to be subclassed and extended so that support for new gestures can be implemented by developers. Adding support for a new gesture involves implementing code to recognize the gesture from incoming events. This is described in the Creating Your Own Gesture Recognizer section.

Using Standard Gestures with Widgets

Gesture objects are applied directly to widgets and other controls that accept user input — these are the target objects. When a gesture object is constructed, the target object is typically passed to the constructor, though it can also be passed as the argument to the setGestureTarget() function.
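For example, a swipe gesture can be set up in the target widget's own constructor. This is a minimal sketch: ImageWidget and its swipeGesture member are illustrative names, not part of the framework.

```cpp
// Sketch: the target widget creates the gesture itself, so the target
// passed to the QSwipeGesture constructor is 'this'.
ImageWidget::ImageWidget(QWidget *parent)
    : QWidget(parent)
{
    swipeGesture = new QSwipeGesture(this);
}
```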


In the above code, the gesture is set up in the constructor of the target object itself, so the argument to the QSwipeGesture constructor is this.

When the user performs a gesture, various signals may be emitted by the gesture object. To monitor the user's actions, you need to connect signals from the gesture object to slots in your code.
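A sketch of such a connection, assuming the gesture object and slot names used elsewhere in this document:

```cpp
// Notify the widget whenever the swipe gesture is recognized.
connect(swipeGesture, SIGNAL(triggered()),
        this, SLOT(swipeTriggered()));
```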


Here, the triggered() signal is used to inform the application that a gesture was used. More precise monitoring of a gesture can be implemented by connecting its started(), canceled() and finished() signals to slots.

Responding to a signal is simply a matter of obtaining the gesture that sent it and examining the information it contains.
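For example, a slot might query the gesture for the direction of the swipe. In this sketch, the direction functions are taken from QSwipeGesture's API, while goPrevImage() and goNextImage() are assumed helper functions of the widget.

```cpp
void ImageWidget::swipeTriggered()
{
    // Examine the horizontal direction of the swipe and change
    // the displayed image accordingly.
    if (swipeGesture->horizontalDirection() == QSwipeGesture::Left)
        goPrevImage();
    else if (swipeGesture->horizontalDirection() == QSwipeGesture::Right)
        goNextImage();
    update();
}
```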


Here, we examine the direction in which the user swiped the widget and modify its contents accordingly.

Using Standard Gestures with Graphics Items

The approach used for applying gestures to widgets can also be used with graphics items. However, instead of passing a target object to the gesture object's constructor, you set a target graphics item with the setGraphicsItem() function.
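A sketch under these assumptions: item is a QGraphicsItem already added to a scene, and handler is a QObject receiving the signals.

```cpp
// Attach a pan gesture to a graphics item rather than a widget.
// The constructor argument here is only a QObject parent; the
// gesture's target is set with setGraphicsItem().
QPanGesture *panGesture = new QPanGesture(scene);
panGesture->setGraphicsItem(item);
connect(panGesture, SIGNAL(triggered()), handler, SLOT(panTriggered()));
```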

Creating Your Own Gesture Recognizer

QGesture is the base class for a user-defined gesture recognizer class. To implement a recognizer, you will need to subclass QGesture and implement the pure virtual function filterEvent() to filter out events that are not relevant to your gesture.

Once you have implemented filterEvent(), your recognizer can process events. A sequence of events may, according to your own rules, represent a gesture. Events can be passed to the recognizer one at a time via filterEvent(), or as a stream by specifying a parent source of events. The events can come from any source and may result in any action the developer defines; neither the source nor the action need be graphical, though that is the most likely scenario. To find out how to connect a source of events so that it automatically feeds the recognizer, see the QGesture documentation.
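A minimal skeleton of such a recognizer, using the filterEvent() and updateState() functions described in this document; the class name and the events it reacts to are illustrative assumptions.

```cpp
class PressAndReleaseGesture : public QGesture
{
    Q_OBJECT
public:
    explicit PressAndReleaseGesture(QObject *parent = 0)
        : QGesture(parent) { }

protected:
    // Return true for events that are relevant to this gesture.
    bool filterEvent(QEvent *event)
    {
        switch (event->type()) {
        case QEvent::MouseButtonPress:
            updateState(Qt::GestureStarted);   // emits started() and triggered()
            return true;
        case QEvent::MouseButtonRelease:
            updateState(Qt::GestureFinished);  // emits finished()
            return true;
        default:
            return false;                      // not relevant to this gesture
        }
    }
};
```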

Recognizers based on QGesture can emit any of the following signals to indicate their progress in recognizing user input:
started()
triggered()
finished()
canceled()
These signals are emitted when the state changes through a call to updateState(); more than one signal may be emitted for a single change of state. There are four GestureStates:

New State            Description                        QGesture Actions on Entering this State
Qt::NoGesture        Initial value                      emit canceled()
Qt::GestureStarted   A continuous gesture has started   emit started() and emit triggered()
Qt::GestureUpdated   A gesture continues                emit triggered()
Qt::GestureFinished  A gesture has finished             emit finished()

Note: started() can be emitted when entering any state greater than NoGesture, provided NoGesture was the previous state. This means that your state machine does not need to use the Qt::GestureStarted state explicitly; you can proceed directly from NoGesture to Qt::GestureUpdated, which emits both a started() and a triggered() signal.

You may use some or all of these states when implementing the pure virtual function filterEvent(). filterEvent() will usually implement a state machine using the GestureState enums, but the details of which states are used are up to the developer.

You may also need to reimplement the virtual function reset() if internal data or objects need to be re-initialized. The function must conclude with a call to updateState() to change the current state to Qt::NoGesture.

The ImageViewer Example

To illustrate how to use QGesture we will look at the ImageViewer example. This example uses QPanGesture, a standard gesture, and an implementation of TapAndHoldGesture. Note that TapAndHoldGesture is platform-dependent.

 void TapAndHoldGesture::reset()
 {
     timer.stop();
     iteration = 0;
     position = startPosition = QPoint();
     updateState(Qt::NoGesture);
 }

In ImageViewer we see that the ImageWidget class uses two gestures: QPanGesture and TapAndHoldGesture. The QPanGesture is a standard gesture which is part of Qt. TapAndHoldGesture is defined and implemented as part of the example. The ImageWidget listens for signals from the gestures, but is not interested in the started() signal.

 private slots:
     void gestureTriggered();
     void gestureFinished();
     void gestureCancelled();

TapAndHoldGesture uses QTouchEvent events and mouse events to detect the start, update and end events that can be mapped onto GestureState changes. The implementation in this case also uses a timer: if the timeout event occurs a given number of times after the start of the gesture, the gesture is considered to have finished whether or not the appropriate touch or mouse event has occurred. Also, if a large jump in the position of the event occurs, as calculated by the manhattanLength() call, the gesture is canceled by calling reset(), which tidies up and uses updateState() to change the state to NoGesture; this results in the canceled() signal being emitted by the recognizer.
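A condensed sketch of that logic. The member names follow the example's reset() implementation shown above, but the timer interval, iteration limit and jump threshold are illustrative assumptions, not the exact example code.

```cpp
bool TapAndHoldGesture::filterEvent(QEvent *event)
{
    if (event->type() == QEvent::MouseButtonPress) {
        position = startPosition = static_cast<QMouseEvent *>(event)->pos();
        timer.start(100, this);                // deliver periodic timeout events
        updateState(Qt::GestureStarted);
        return true;
    }
    if (event->type() == QEvent::MouseMove) {
        position = static_cast<QMouseEvent *>(event)->pos();
        // A large jump in position cancels the gesture.
        if ((position - startPosition).manhattanLength() > 40) {
            reset();                           // ends in updateState(Qt::NoGesture)
            return true;
        }
        updateState(Qt::GestureUpdated);
        return true;
    }
    return false;                              // not relevant to this gesture
}

void TapAndHoldGesture::timerEvent(QTimerEvent *event)
{
    if (event->timerId() != timer.timerId())
        return;
    // After enough timeouts the gesture is considered finished, whether
    // or not the corresponding touch or mouse event has arrived.
    if (++iteration >= 5) {
        timer.stop();
        updateState(Qt::GestureFinished);
    }
}
```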

ImageWidget handles the signals by connecting them to its slots, although canceled() is not connected here.

     panGesture = new QPanGesture(this);
     connect(panGesture, SIGNAL(triggered()), this, SLOT(gestureTriggered()));

     tapAndHoldGesture = new TapAndHoldGesture(this);
     connect(tapAndHoldGesture, SIGNAL(triggered()), this, SLOT(gestureTriggered()));
     connect(tapAndHoldGesture, SIGNAL(finished()), this, SLOT(gestureFinished()));

These slots, in turn, need to be aware of which gesture object was the source of the signal, since we have more than one source per slot. This is easily done by using the QObject::sender() function, as shown here:

 void ImageWidget::gestureTriggered()
 {
     if (sender() == panGesture) {
         // pan-specific handling goes here
     }
 }

As triggered() signals are handled by gestureTriggered(), position updates may invoke calls to, for example, goNextImage(); this causes a change in the image handling logic of ImageWidget and a call to updateImage() to display the changed state.

Following the logic of how the QEvent is processed, we can summarize it as follows:

1. An input event arrives and is passed to the recognizer's filterEvent() function.
2. filterEvent() applies the recognizer's own rules and calls updateState() to change the GestureState.
3. updateState() emits the appropriate signals: started(), triggered(), finished() or canceled().
4. Slots connected to those signals, such as gestureTriggered(), identify the gesture with QObject::sender() and update the application accordingly.


Copyright © 2009 Nokia Corporation and/or its subsidiary(-ies) Trademarks
Qt 4.6.0