Qt Quarterly

Recognizing Mouse Gestures
by Johan Thelin
For fast typists, there are keyboard shortcuts. These are fully supported by Qt and easy to set up. The mouse-oriented equivalent of keyboard shortcuts is mouse gestures. In this article, I will present a few classes that you can use to support mouse gestures in your Qt application.

Mouse gesture support is becoming increasingly common, not only on PDAs but also in desktop applications such as Opera and Mozilla Firefox, not to mention games (e.g., Black & White). The concept is simple: The user draws a shape while holding down a mouse button, and if the shape is recognized, the associated action is executed.

We will start by looking at a basic mouse gesture framework I have developed using Qt 4. We will then examine how to use the framework from a Qt application.

[Download source code]

The Recognition Algorithm

Recognizing a mouse gesture isn't as difficult as it may sound, because it can be reduced to four fairly straightforward steps: filtering, limiting, simplifying, and matching.

Step 1: Filtering the mouse movement

The filtering is applied while a mouse movement (a segment in the gesture) is recorded. It prevents very short mouse movements from being recorded. This step is necessary because different hardware reports mouse positions at different rates, and because the rest of the algorithm might fail if the movement is too small.

Filtering a gesture
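
The filtering step can be sketched in framework-neutral C++. Note that the helper name and the choice of Manhattan distance are assumptions for illustration, not the article's actual implementation; only the minimumMovement default follows the MouseGestureRecognizer constructor shown later:

```cpp
#include <cstdlib>

// Sketch of the filtering step (hypothetical helper): a new point is
// only recorded when it is at least minimumMovement pixels away from
// the last recorded point, so jittery, hardware-dependent position
// reports do not produce tiny segments.
bool shouldRecordPoint(int lastX, int lastY, int x, int y,
                       int minimumMovement = 5)
{
    // Manhattan distance is cheap and good enough for filtering.
    return std::abs(x - lastX) + std::abs(y - lastY) >= minimumMovement;
}
```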

Step 2: Limiting the directions

The limiting step is about determining what the user actually meant. This is done by limiting each segment of the gesture to one of the four directions up, down, left, or right. More precisely, we need to compare the x and y components of each segment and zero out the smaller of the two.

Limiting a gesture
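
The comparison described above fits in a few lines; this is a sketch with a hypothetical helper name, not the framework's actual code:

```cpp
#include <cstdlib>

// Sketch of the limiting step: compare the x and y components of a
// segment and zero out the smaller one, reducing the segment to one
// of the four directions up, down, left, or right.
void limitSegment(int &dx, int &dy)
{
    if (std::abs(dx) >= std::abs(dy))
        dy = 0;     // mostly horizontal: keep left/right only
    else
        dx = 0;     // mostly vertical: keep up/down only
}
```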

Step 3: Simplifying the direction list

The third step, simplifying, consists of finding consecutive movements in the same direction and joining these. This gives us a list with the directions that make up the gesture (e.g., "right, up, right, up"). This list can then be matched against predefined gestures.

Simplifying a gesture
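
The joining of consecutive identical directions can be sketched as follows, using the same enum and std::list types as the framework-neutral classes shown later (the function name is an assumption):

```cpp
#include <list>

enum Direction { Up, Down, Left, Right };
typedef std::list<Direction> DirectionList;

// Sketch of the simplifying step: collapse runs of identical
// directions, so that (Right, Right, Up, Up, Right) becomes
// (Right, Up, Right).
DirectionList simplify(const DirectionList &input)
{
    DirectionList result;
    for (DirectionList::const_iterator it = input.begin();
         it != input.end(); ++it) {
        if (result.empty() || result.back() != *it)
            result.push_back(*it);
    }
    return result;
}
```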

Step 4: Matching and reducing

Matching is the most difficult part of the algorithm, since it is common for users to change direction slightly as they release the mouse button. In addition, we must deal with small glitches along long movements.

The algorithm starts by trying to match the gesture against a predefined list. If that fails, we remove the shortest segment from the gesture and try again. This is repeated until a match is found or too large a proportion of the original gesture has been removed.

Matching a gesture
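
The retry loop can be sketched as below. The Segment layout, function name, and the way the removed proportion is tracked are assumptions for illustration; the minimumMatch = 0.9 default follows the MouseGestureRecognizer constructor shown later:

```cpp
#include <list>
#include <utility>

enum Direction { Up, Down, Left, Right };

// Each recorded segment keeps its direction and its length in pixels,
// so the shortest segment can be dropped when matching fails.
// (Hypothetical layout, for illustration only.)
typedef std::pair<Direction, int> Segment;
typedef std::list<Segment> SegmentList;

// Sketch of the matching loop: compare the simplified direction list
// against a target gesture; on failure, drop the shortest segment and
// retry, until a match is found or too much of the original gesture
// (more than 10% here, matching minimumMatch = 0.9) has been removed.
bool matches(SegmentList segments, const std::list<Direction> &target,
             double minimumMatch = 0.9)
{
    int totalLength = 0;
    for (SegmentList::iterator it = segments.begin();
         it != segments.end(); ++it)
        totalLength += it->second;

    int remaining = totalLength;
    while (!segments.empty() && remaining >= minimumMatch * totalLength) {
        // Simplify the surviving segments into a direction list.
        std::list<Direction> dirs;
        for (SegmentList::iterator it = segments.begin();
             it != segments.end(); ++it)
            if (dirs.empty() || dirs.back() != it->first)
                dirs.push_back(it->first);
        if (dirs == target)
            return true;

        // Remove the shortest segment and try again.
        SegmentList::iterator shortest = segments.begin();
        for (SegmentList::iterator it = segments.begin();
             it != segments.end(); ++it)
            if (it->second < shortest->second)
                shortest = it;
        remaining -= shortest->second;
        segments.erase(shortest);
    }
    return false;
}
```

This handles the "small glitch" case directly: a long rightward movement with a tiny upward blip in the middle still matches a plain "right" gesture once the blip is removed.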

The Qt Classes

The mouse gesturing framework described in this article makes it easy to add gesture support to existing Qt applications. The Qt-specific part of the framework consists of two classes: QjtMouseGestureFilter and QjtMouseGesture.

Let's start with the QjtMouseGestureFilter class definition:

    class QjtMouseGestureFilter : public QObject
    {
    public:
        QjtMouseGestureFilter(Qt::MouseButton button = Qt::RightButton,
                              QObject *parent = 0);
        ~QjtMouseGestureFilter();
    
        void addGesture(QjtMouseGesture *gesture);
        void clearGestures(bool deleteGestures = false);
    
    protected:
        bool eventFilter(QObject *obj, QEvent *event);
        ...
    };

To use it, we simply need to create an instance, call installEventFilter() on the windows or widgets that should support gesture input, and populate it with a list of mouse gestures. Here's the definition of the gesture class:

    class QjtMouseGesture : public QObject
    {
        Q_OBJECT
    
    public:
        QjtMouseGesture(const DirectionList &directions,
                        QObject *parent = 0);
        ~QjtMouseGesture();
    
        const DirectionList directions() const;
    
    signals:
        void gestured();
    
    private:
        ...
    };

A gesture is essentially a list of directions and a signal. The available directions are Up, Down, Left, Right, AnyHorizontal, AnyVertical, and NoMatch. AnyHorizontal means "left or right", whereas AnyVertical means "up or down". The NoMatch direction is used to create a gesture that is matched if no other gesture is matched.

The Direction enum that defines the available directions is part of the Gesture namespace, described in the next section. To save a few keystrokes, we use the following typedefs and using directives:

    typedef Gesture::Direction Direction;
    
    using Gesture::Up;
    using Gesture::Down;
    using Gesture::Left;
    using Gesture::Right;
    using Gesture::AnyHorizontal;
    using Gesture::AnyVertical;
    using Gesture::NoMatch;
    
    typedef QList<Direction> DirectionList;

Bridging the Gap

The actual mouse gesture recognition classes are available as a set of framework-neutral classes in the Gesture namespace. The Qt-specific classes map Qt's event filtering mechanism to a set of framework-independent mouse-following functions, and a callback class to a Qt signal. The neutral classes corresponding to QjtMouseGesture are shown below:

    typedef enum { Up, Down, Left, Right, AnyHorizontal,
                   AnyVertical, NoMatch } Direction;
    typedef std::list<Direction> DirectionList;
    
    class MouseGestureCallback
    {
    public:
        virtual void callback() = 0;
    };
    
    struct GestureDefinition
    {
        GestureDefinition(const DirectionList &d, MouseGestureCallback *c)
            : directions(d), callbackClass(c) {}
    
        DirectionList directions;
        MouseGestureCallback *callbackClass;
    };

To bridge the gap with Qt, we must subclass MouseGestureCallback and reimplement callback() as follows:

    class GestureCallbackToSignal
            : public Gesture::MouseGestureCallback
    {
    public:
        GestureCallbackToSignal(QjtMouseGesture *object) {
            m_object = object;
        }
    
        void callback() {
            m_object->emitGestured();
        }
    
    private:
        QjtMouseGesture *m_object;
    };

To allow the bridging class to emit the gestured() signal on behalf of the QjtMouseGesture object, we use the emitGestured() private function. This requires GestureCallbackToSignal to be declared as a friend of QjtMouseGesture. The private section of QjtMouseGesture looks like this:

    class GestureCallbackToSignal;
    
    class QjtMouseGesture : public QObject
    {
        ...
    
    private:
        friend class GestureCallbackToSignal;
        void emitGestured();
    
        DirectionList m_directions;
    };

The framework-neutral class corresponding to QjtMouseGestureFilter is called MouseGestureRecognizer:

    class MouseGestureRecognizer
    {
    public:
        MouseGestureRecognizer(int minimumMovement = 5, double minimumMatch = 0.9);
        ~MouseGestureRecognizer();
    
        void addGestureDefinition(
                const GestureDefinition &gesture);
        void clearGestureDefinitions();
    
        void startGesture(int x, int y);
        void addPoint(int x, int y);
        void endGesture(int x, int y);
        void abortGesture();
    
    private:
        ...
        class Private;
        Private *d;
    };

The startGesture(), addPoint(), and endGesture() functions are invoked from the event filtering part of the QjtMouseGestureFilter class. The actual bridging takes place in addGesture():

    void QjtMouseGestureFilter::addGesture(
            QjtMouseGesture *gesture)
    {
        Gesture::DirectionList dl = gesture->directions().toStdList();
    
        d->bridges.append(GestureCallbackToSignal(gesture));
        d->gestures.append(gesture);
    
        d->mgr.addGestureDefinition(
              Gesture::GestureDefinition(dl, &d->bridges.last()));
    }

We copy the directions from a QList to an STL list, wrap the gesture in a GestureCallbackToSignal instance, and finally add a GestureDefinition to the MouseGestureRecognizer.

The bridges and gestures added to the QjtMouseGestureFilter are held in the private member variable d along with the MouseGestureRecognizer instance.

Designing Gestures

Designing mouse gestures consists of defining a list of directions such as "up, left". When designing gestures for an application, we must always keep the matching process in mind. For example, defining gestures that differ by only one segment is risky, because the user could easily trigger the wrong action with a small unintentional movement.

In general, mouse gestures are only helpful if they are easy to remember and if the application is mouse-oriented. For example, it is usually pointless to define a gesture for opening a dialog that requires keyboard input.

Bringing Gestures to the Application

From the perspective of the Qt application programmer, what is really interesting is how we can bring gestures to an application. To illustrate this, we will use the simple MainWindow class shown below as an example.

A gesture dialog

Here's the class definition:

    class MainWindow : public QMainWindow
    {
        Q_OBJECT
    
    public:
        MainWindow();
    
    public slots:
        void clearAll();
        void setAll();
    
        void noMatch();
    
    private:
        ...
    };

The original main() function, with no gesture support, looks like this:

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);
        MainWindow mainWin;
        mainWin.show();
        return app.exec();
    }

To add gestures to the application, we must create an instance of QjtMouseGestureFilter and install it on the MainWindow. We must also define gestures and hook them up to the application. For example:

    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);
        MainWindow mainWin;
    
        QjtMouseGestureFilter filter;
        mainWin.installEventFilter(&filter);
    
        // Clear all by making three sideways moves
        QjtMouseGesture g1(DirectionList() << AnyHorizontal
                          << AnyHorizontal << AnyHorizontal);
        filter.addGesture(&g1);
        QObject::connect(&g1, SIGNAL(gestured()),
                         &mainWin, SLOT(clearAll()));
    
        // Set all by moving up, then left
        QjtMouseGesture g2(DirectionList() << Up << Left);
        filter.addGesture(&g2);
        QObject::connect(&g2, SIGNAL(gestured()),
                         &mainWin, SLOT(setAll()));
    
        mainWin.show();
        return app.exec();
    }

To make the application slightly more user friendly, we can add a noMatch() slot to MainWindow and associate it with a special "no match" gesture:

    int main(int argc, char **argv)
    {
        ...
        // When nothing else matches
        QjtMouseGesture g3(DirectionList() << NoMatch);
        filter.addGesture(&g3);
        QObject::connect(&g3, SIGNAL(gestured()),
                         &mainWin, SLOT(noMatch()));
        ...
    }

Running the Example

So far, we have talked about how the framework recognizes gestures, and taken a brief tour of the code for a simple example. Building the example is simply a matter of running qmake followed by make.

Now, let's run the example. The window (shown above) displays five checkboxes that will be set or unset when an appropriate gesture is used. If you hold down the right mouse button – the default gesture button in the QjtMouseGestureFilter constructor – the mouse movements will be recorded. Releasing the mouse button stops the recording, and the example either updates the checkboxes if it recognized the gesture, or displays a dialog reminding you which gestures it knows about.

One recognized gesture involves moving the mouse up then to the left; another requires three horizontal movements. There's also a secret gesture that you'll have to discover for yourself!

Conclusions and Possible Improvements

With the mouse gesture framework presented in this article, mouse gesture support can easily be added to any Qt application. The framework-independent gesture-recognizing code has been wrapped in a Qt interface that makes the integration with other Qt classes seamless. Qt-specific enhancements could include support for QSettings-based storage of gestures, provision of visual feedback to help users learn new gestures, and even gesture editing facilities.

The gesture recognizing algorithm outlined here is simple to get started with but has some limitations. In particular, it doesn't support diagonal directions or relative lengths. Pen-based user interfaces such as PDAs and tablet computers would benefit most from such improvements.


This document is licensed under the Creative Commons Attribution-Share Alike 2.5 license.

Copyright © 2006 Trolltech