Developing a WPF Application for Windows 7 Touch

In my previous post, I discussed the pros and cons of different development platforms for Windows 7 touch applications. In this post, my goal is to discuss common development considerations and issues for a touch application built with WPF. If you haven't read the Windows Touch Guidance whitepaper, please review that document for good general guidance on developing a touch application.

How Much “Touch” is Enough?

One benefit of Windows touch is that single-touch and multitouch gestures are converted to mouse messages if your application doesn't handle them explicitly. Therefore, your application may already be somewhat touch friendly. Depending on the touch scenario you want to support, you may only need to tweak your application in a few areas to provide a good touch user experience. A good reference explaining what makes an application touch "friendly", "enabled", or "optimized" is the MSDN Touch Guidelines.
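To see this promotion in action, consider the minimal sketch below (the window and names are hypothetical, and the UI is built in code only to keep the example self-contained): a standard Button raises its ordinary Click event when tapped, with no touch-specific code at all.

using System;
using System.Windows;
using System.Windows.Controls;

public class PromotionDemoWindow : Window
{
    [STAThread]
    public static void Main()
    {
        var window = new PromotionDemoWindow
        {
            Title = "Touch Promotion Demo",
            Width = 400,
            Height = 300
        };

        var button = new Button
        {
            Content = "Tap or click me",
            FontSize = 32,
            Margin = new Thickness(40)
        };

        // No touch-specific handling here: a finger tap is promoted to
        // mouse messages, so the ordinary Click event still fires.
        button.Click += (s, e) => MessageBox.Show("Clicked (or tapped).");

        window.Content = button;
        new Application().Run(window);
    }
}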

For the remainder of this post, I'll focus on key touch considerations and suggest ways to implement them using WPF.

Text Input Panel (TIP)

The TIP can be very handy if you want to allow users to enter text without a keyboard. Several "slate" PCs now exist that either have an external keyboard or allow the user to configure the notebook to hide the keyboard. The TIP has existed since Windows XP Tablet PC Edition and is available in all versions of Windows 7. WPF displays the TIP automatically for most text-entry controls, such as TextBox. If you touch a TextBox, you will see a TIP icon.

[Figure: TIP icon displayed next to the touched TextBox]

If you touch the TIP icon, the virtual keyboard is displayed:

[Figure: TIP virtual keyboard]

Almost all of WPF's controls that accept text entry support the TIP; however, PasswordBox does not. You will want to avoid using PasswordBox or implement your own way to input passwords without a keyboard.
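One possible workaround, sketched below under a couple of assumptions, is to keep the PasswordBox but launch the TIP host process (TabTip.exe) yourself when the field receives focus. The path used here is the default Windows 7 install location, and passwordBox is a hypothetical element assumed to be declared in the window's XAML; treat this as a starting point rather than a drop-in solution.

using System;
using System.Diagnostics;
using System.IO;
using System.Windows;

public partial class LoginWindow : Window
{
    // Default Windows 7 location of the Text Input Panel host process.
    // Note: on 64-bit systems running a 32-bit app, this path may be
    // redirected and need additional handling.
    private static readonly string TipPath = Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.CommonProgramFiles),
        @"microsoft shared\ink\TabTip.exe");

    public LoginWindow()
    {
        InitializeComponent();

        // passwordBox is assumed to be a PasswordBox declared in XAML.
        passwordBox.GotFocus += ShowTextInputPanel;
    }

    private void ShowTextInputPanel(object sender, RoutedEventArgs e)
    {
        // PasswordBox does not show the TIP on its own, so start it manually.
        if (File.Exists(TipPath))
        {
            Process.Start(TipPath);
        }
    }
}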

Screen Rotation

The Windows 7 Engineering Guidance for Slate PCs recommends that PCs enable screen rotation automatically through sensors or via a manual hardware button. If you plan to target mobile touch PCs, you should consider supporting screen rotation in your touch application. To support rotation, you generally detect that the screen dimensions have changed and then adjust the layout.

One way to detect a screen rotation is to register for the DisplaySettingsChanged event. The following sample code registers for the DisplaySettingsChanged event; the handler changes the Window title and the size of a WrapPanel to change the control layout.

public MainWindow()
{
    InitializeComponent();

    // Listen for display changes (resolution or orientation).
    Microsoft.Win32.SystemEvents.DisplaySettingsChanged += new System.EventHandler(displaySettingsChanged);
}

private void displaySettingsChanged(object sender, EventArgs e)
{
    if (System.Windows.SystemParameters.PrimaryScreenHeight > System.Windows.SystemParameters.PrimaryScreenWidth)
    {
        // The screen is taller than it is wide: run the application in portrait.
        this.Title = "Portrait View";
        wrapPanel1.Width = 600;
        wrapPanel1.Height = 800;
    }
    else
    {
        // The screen is wider than it is tall: run the application in landscape.
        this.Title = "Landscape View";
        wrapPanel1.Width = 800;
        wrapPanel1.Height = 600;
    }
}


Landscape view:

[Figure: application in landscape layout]

Portrait view:

[Figure: application in portrait layout]

For more information on detecting screen rotation, see Detecting Screen Orientation and Screen Rotation in Tablet PC Applications.

Gestures, Manipulations, and Inertia

WPF is one of the easiest Windows development platforms for enabling touch manipulation of objects. The UIElement, UIElement3D, and ContentElement classes expose events that occur when a user touches an element, and several controls support touch-enabled scrolling out of the box. UIElement also supports manipulation events, which interpret touch input as scale, rotate, or translate operations. I won't go into the details of implementing manipulations in this post because there is already great information on this topic. To get started, please review Walkthrough: Creating Your First Touch Application on MSDN.
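As a minimal illustration of those manipulation events (a rough sketch in the spirit of the walkthrough, with hypothetical names and the UI built in code to keep it self-contained), the example below enables manipulation on a Rectangle and applies the reported rotation, scale, and translation through a MatrixTransform, with a bit of inertia when the fingers lift:

using System;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;
using System.Windows.Shapes;

public class ManipulationDemoWindow : Window
{
    public ManipulationDemoWindow()
    {
        Title = "Manipulation Sketch";
        Width = 800;
        Height = 600;

        var rect = new Rectangle
        {
            Width = 200,
            Height = 200,
            Fill = Brushes.SteelBlue,
            // Opt the element in to manipulation events.
            IsManipulationEnabled = true,
            RenderTransform = new MatrixTransform()
        };

        var canvas = new Canvas();
        canvas.Children.Add(rect);
        Content = canvas;

        rect.ManipulationStarting += (s, e) =>
        {
            // Report manipulation coordinates relative to the window.
            e.ManipulationContainer = this;
            e.Handled = true;
        };

        rect.ManipulationDelta += (s, e) =>
        {
            var element = (Rectangle)s;
            var matrix = ((MatrixTransform)element.RenderTransform).Matrix;

            // Apply the rotation, scale, and translation reported by WPF.
            matrix.RotateAt(e.DeltaManipulation.Rotation,
                            e.ManipulationOrigin.X, e.ManipulationOrigin.Y);
            matrix.ScaleAt(e.DeltaManipulation.Scale.X, e.DeltaManipulation.Scale.Y,
                           e.ManipulationOrigin.X, e.ManipulationOrigin.Y);
            matrix.Translate(e.DeltaManipulation.Translation.X,
                             e.DeltaManipulation.Translation.Y);

            element.RenderTransform = new MatrixTransform(matrix);
            e.Handled = true;
        };

        rect.ManipulationInertiaStarting += (s, e) =>
        {
            // Let the rectangle coast to a stop after the fingers lift
            // (roughly 10 inches/sec^2, expressed in device-independent units per ms^2).
            e.TranslationBehavior.DesiredDeceleration = 10.0 * 96.0 / (1000.0 * 1000.0);
            e.Handled = true;
        };
    }

    [STAThread]
    public static void Main()
    {
        new Application().Run(new ManipulationDemoWindow());
    }
}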