Multitouch Part 2: Support for Gestures in Windows 7

As mentioned in yesterday’s multitouch post, there are a number of gestures that are recognized by Windows 7 out of the box:

  • Pan (also called Translate) – put a finger or fingers down and drag
  • Rotate – touch with two fingers at opposite ends and turn fingers in a circle
  • Zoom – touch with two fingers at opposite ends and move the fingers closer or further apart
  • Tap – touching and quickly lifting with a finger; equivalent to a click
  • Double-tap – quickly tapping twice; equivalent to a double-click
  • Press and tap (also called Finger Roll) – place one finger on the screen, place a second finger down, lift the second finger immediately, and then lift the first finger. This is essentially holding one finger down while tapping with a second finger. By default, this gesture is equivalent to a right-click.

Now, let’s look at how to code for gestures. 

How can I tell if the machine supports multitouch? 

First, it would be good to know if the machine on which your application is running supports multitouch, so if it doesn’t, you can degrade gracefully. 

#include <windows.h>

// Test for multitouch support
bool bMultiTouch = false;
int value = GetSystemMetrics(SM_DIGITIZER);
if (value & NID_MULTI_INPUT)   // NID_MULTI_INPUT == 0x40
{
    bMultiTouch = true;        // the digitizer supports multiple inputs
}

GetSystemMetrics is a function that retrieves system configuration settings.  If you pass in SM_DIGITIZER, it returns a bit field containing the following flags:

Value                  Hex value   Meaning
TABLET_CONFIG_NONE     0x00        The input digitizer does not have touch capabilities.
NID_INTEGRATED_TOUCH   0x01        The device has an integrated touch digitizer.
NID_EXTERNAL_TOUCH     0x02        The device has an external touch digitizer.
NID_INTEGRATED_PEN     0x04        The device has an integrated pen digitizer.
NID_EXTERNAL_PEN       0x08        The device has an external pen digitizer.
NID_MULTI_INPUT        0x40        The device supports multiple sources of digitizer input.
NID_READY              0x80        The device is ready to receive digitizer input.
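
For example, here is a minimal sketch that combines the flags above: it checks that the digitizer is ready and supports multiple inputs, then asks how many contacts the hardware can track via SM_MAXIMUMTOUCHES (the variable names are just for illustration).

#include <windows.h>

// Query digitizer capabilities; re-query if the hardware may change later.
int digitizer = GetSystemMetrics(SM_DIGITIZER);

bool bReady      = (digitizer & NID_READY) != 0;        // ready to receive input
bool bMultiTouch = (digitizer & NID_MULTI_INPUT) != 0;  // multiple input sources

if (bReady && bMultiTouch)
{
    // Number of simultaneous contacts the hardware reports it can track.
    int maxTouches = GetSystemMetrics(SM_MAXIMUMTOUCHES);
    // Enable multitouch-specific features here.
}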

There are some limitations to GetSystemMetrics.  For example, it has no support for plug and play, so be careful about treating its return value as a permanent configuration.  If a multitouch digitizer is added later, you would need to call the function again to find out that multitouch is now supported. 

 

How can my application recognize that a gesture has occurred? 

By default, your window will receive notifications when gestures occur, in the form of WM_GESTURE messages. 

A window can receive gestures or raw touches, but not both.  If you want to work at the raw touch level as opposed to the gesture level, you can call RegisterTouchWindow.  You will then stop receiving WM_GESTURE messages and instead receive WM_TOUCH messages. 
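
As a quick sketch, opting a window into raw touch input is a single call (error handling kept minimal here); UnregisterTouchWindow reverses it.

// Opt this window out of gestures and into raw touch messages.
// After this call the window receives WM_TOUCH instead of WM_GESTURE.
if (!RegisterTouchWindow(hWnd, 0))
{
    // Registration failed; the window continues to receive WM_GESTURE.
}

// Later, to return to gesture messages:
// UnregisterTouchWindow(hWnd);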

What if I want my application to respond to only one or two of the known gestures, and ignore the rest?

By default, the application receives all gesture messages.  However, you may have an application in which the user can move a marker around on a game board, but you don’t want the marker to be resized.  In that case, you don’t care about the Zoom gesture, but you do want the Translate and Rotate gestures.  You can configure which gestures will be sent using SetGestureConfig.  It takes an array of GESTURECONFIG structures, each of which contains an ID (the dwID member), the messages to enable (the dwWant member), and the messages to disable (the dwBlock member).  This changes the gesture configuration for the lifetime of the window, not just for the next gesture. 

Here’s an example.  I create a GESTURECONFIG structure that blocks nothing and wants all gestures, and pass it to the SetGestureConfig function.

GESTURECONFIG gestureConfig;
gestureConfig.dwID    = 0;               // 0 applies the setting to all gestures
gestureConfig.dwBlock = 0;               // block nothing
gestureConfig.dwWant  = GC_ALLGESTURES;  // want all gestures

SetGestureConfig(hWnd,
                 0,                      // reserved, must be 0
                 1,                      // number of GESTURECONFIG entries
                 &gestureConfig,
                 sizeof(gestureConfig)); // size of a single entry
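
For the game-board scenario described above, a sketch of a configuration that wants pan and rotate but blocks zoom might look like this (the exact mix of GC_* flags is up to your application).

// Want pan and rotate, block zoom; other gestures keep their defaults.
GESTURECONFIG configs[] = {
    { GID_PAN,    GC_PAN,    0       },  // dwID, dwWant, dwBlock
    { GID_ROTATE, GC_ROTATE, 0       },
    { GID_ZOOM,   0,         GC_ZOOM }
};

SetGestureConfig(hWnd, 0,
                 ARRAYSIZE(configs),      // number of GESTURECONFIG entries
                 configs,
                 sizeof(GESTURECONFIG));  // size of a single entry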

You can also dynamically change your gesture configuration.  The WM_GESTURENOTIFY message is sent to your window to indicate that a gesture is about to be sent, which gives you an opportunity to set your gesture configuration. 
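
In a window procedure, a sketch of that handler might look like the following (assuming the usual hWnd, message, wParam, and lParam parameter names); here it simply re-enables all gestures before passing the notification on.

case WM_GESTURENOTIFY:
    {
        // A gesture is about to be delivered; this is the last chance to
        // change the configuration before the WM_GESTURE messages arrive.
        GESTURECONFIG config = { 0, GC_ALLGESTURES, 0 };  // dwID, dwWant, dwBlock
        SetGestureConfig(hWnd, 0, 1, &config, sizeof(config));
    }
    // Let the default handler finish processing the notification.
    return DefWindowProc(hWnd, message, wParam, lParam);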

OK, now I’m set up to get the gestures I want. How can my application respond to gestures?

You will receive notifications that gestures occurred as WM_GESTURE messages.  Use a switch statement to discover what gesture you received, and respond appropriately.

Information about the gesture is stored in the GESTUREINFO structure. 

// Create a structure to populate and retrieve the extra message info.
GESTUREINFO gi;
ZeroMemory(&gi, sizeof(GESTUREINFO));
gi.cbSize = sizeof(GESTUREINFO);

BOOL bResult  = GetGestureInfo((HGESTUREINFO)lParam, &gi);
BOOL bHandled = FALSE;

if (bResult)
{
    // Now interpret the gesture.
    switch (gi.dwID)
    {
       case GID_ZOOM:
           // Code for zooming goes here
           bHandled = TRUE;
           break;
       case GID_PAN:
           // Code for panning goes here
           bHandled = TRUE;
           break;
       case GID_ROTATE:
           // Code for rotation goes here
           bHandled = TRUE;
           break;
       case GID_TWOFINGERTAP:
           // Code for two-finger tap goes here
           bHandled = TRUE;
           break;
       case GID_PRESSANDTAP:
           // Code for press and tap goes here
           bHandled = TRUE;
           break;
       default:
           // A gesture was not recognized
           break;
    }
}
else
{
    // Handle error...
}

if (bHandled)
{
    // Once the gesture has been handled, close the gesture info handle.
    CloseGestureInfoHandle((HGESTUREINFO)lParam);
    return 0;
}

// Pass unhandled gestures on to the default window procedure,
// which will close the gesture info handle for us.
return DefWindowProc(hWnd, message, wParam, lParam);
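
Most of the per-gesture data is carried in the GESTUREINFO fields: ptsLocation gives the gesture location in screen coordinates, and ullArguments carries gesture-specific data.  As a sketch of what the GID_ZOOM case might grow into, the code below tracks the distance between the two fingers across messages (the static variable is just for illustration).

case GID_ZOOM:
{
    // For GID_ZOOM, the low 32 bits of ullArguments hold the current
    // distance between the two touch points; ptsLocation is the zoom center.
    static DWORD dwLastDistance = 0;
    DWORD dwDistance = (DWORD)gi.ullArguments;

    POINT ptCenter = { gi.ptsLocation.x, gi.ptsLocation.y };
    ScreenToClient(hWnd, &ptCenter);   // gesture coordinates are in screen space

    if (gi.dwFlags & GF_BEGIN)
    {
        dwLastDistance = dwDistance;   // remember the starting distance
    }
    else if (dwLastDistance != 0)
    {
        // Scale factor relative to the previous WM_GESTURE message.
        double zoomFactor = (double)dwDistance / dwLastDistance;
        // Apply zoomFactor to your content, centered on ptCenter, here.
        dwLastDistance = dwDistance;
    }
    bHandled = TRUE;
    break;
}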

In tomorrow’s post, we will talk about managed code support for multitouch. 

Other blog posts in this series:

Multitouch Part 1: Getting Started with Multitouch in Windows 7

Multitouch Part 2: Support for Gestures in Windows 7

Multitouch Part 3: Multitouch in managed code and WPF

Multitouch Part 4: Multitouch in Silverlight

Multitouch Part 5: User Experience with Multitouch

Comments

  • Anonymous
    December 16, 2010
    I've been working with this API recently, and I'm wondering about the bits returned from GetSystemMetrics(SM_DIGITIZER). What do NID_INTEGRATED_TOUCH and NID_EXTERNAL_TOUCH mean in real terms? What kind of hardware is that? Or does it matter for gesture support?

  • Anonymous
    December 16, 2010
    So when NID_MULTI_INPUT is not set in SM_DIGITIZER, can I assume my hwnd will never receive WM_GESTURE and WM_TOUCH?

  • Anonymous
    December 17, 2010
    @Scott - They just specify whether the touch digitizer is integrated with your machine/device or not.  For example, I have an HP TouchSmart laptop with a touch screen; this will return NID_INTEGRATED_TOUCH because it's a baked-in part of my laptop screen.  If you use one of those touch screens that you can buy separately and connect to your machine via USB, this is considered NID_EXTERNAL_TOUCH.  

  • Anonymous
    September 07, 2014
    I am trying to make this work on MFC dialog based app where the dialog is CDHtmlDialog based. Are there any extra steps to make it work on html based MFC dialog? Pls let me know.

  • Anonymous
    October 27, 2015
    I'm having problems finding info on this even today in 2015, so was pleased to find your page. I have an MFC app on Win 10 Home, tablet, I can get touch messages, but none of the WM_GESTURE type messages are arriving, even checked on PreTranslate. I also can't link GetGestureStatus, I have it in the includes, but it fails to find it at link time... I have WinVer at 0x601 to keep compatibility with some of my older code. Any ideas where I should look?