
Using Kinect InteractionStream Outside of WPF

Last month, with the release of version 1.7 of our SDK and toolkit, we introduced something called the InteractionStream.  Included in that release were two new samples, Controls Basics and Interaction Gallery, which, among other things, show how to use the new InteractionStream along with new interactions like Press and Grip.  Both of these new samples are written using managed code (C#) and WPF.

One question I’ve been hearing from developers is, “I don’t want to use WPF, but I still want to use the InteractionStream with managed code.  How do I do this?”  In this post I’m going to show how to do exactly that.  I’m going to take it to the extreme by removing the UI layer completely: we’ll build a C# console app.

The way our application will work is summarized in the diagram below:

[Diagram: overview of the application flow]

 

There are a few things to note here:

  1. Upon starting the program, we initialize our sensor and interactions, and create FrameReady event handlers (a sketch of steps 1 and 2 appears after this list).

  2. Our sensor generates data for every frame.  We use our FrameReady event handlers to respond to and handle depth, skeleton, and interaction frames.

  3. The program implements the IInteractionClient interface, which requires us to implement a method called GetInteractionInfoAtLocation.  This method gives us back information about interactions happening with a particular user at a specified location:

     public InteractionInfo GetInteractionInfoAtLocation(int skeletonTrackingId, InteractionHandType handType, double x, double y)
     {
         var interactionInfo = new InteractionInfo
         {
             IsPressTarget = false,
             IsGripTarget = false
         };

         // Map coordinates from [0.0,1.0] coordinates to UI-relative coordinates
         double xUI = x * InteractionRegionWidth;
         double yUI = y * InteractionRegionHeight;

         var uiElement = this.PerformHitTest(xUI, yUI);

         if (uiElement != null)
         {
             interactionInfo.IsPressTarget = true;

             // If UI framework uses strings as button IDs, use string hash code as ID
             interactionInfo.PressTargetControlId = uiElement.Id.GetHashCode();

             // Designate center of button to be the press attraction point
             //// TODO: Create your own logic to assign press attraction points if center
             //// TODO: is not always the desired attraction point.
             interactionInfo.PressAttractionPointX = ((uiElement.Left + uiElement.Right) / 2.0) / InteractionRegionWidth;
             interactionInfo.PressAttractionPointY = ((uiElement.Top + uiElement.Bottom) / 2.0) / InteractionRegionHeight;
         }

         return interactionInfo;
     }
    
  4. The other noteworthy part of our program is the InteractionFrameReady method.  This is where we process information about our users, route our UI events, handle things like Grip and GripRelease, etc. (a sketch of this handler also appears after this list).
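For readers who want a concrete starting point, here is a minimal sketch of what steps 1 and 2 can look like in a console app.  It assumes references to Microsoft.Kinect.dll and the interaction toolkit assembly; the field and method names (InitializeKinect, SensorDepthFrameReady, SensorSkeletonFrameReady) are illustrative rather than taken from the attached sample:

    // Requires: using System; using System.Linq; using Microsoft.Kinect;
    //           using Microsoft.Kinect.Toolkit.Interaction;
    private KinectSensor sensor;
    private InteractionStream interactionStream;
    private Skeleton[] skeletons;

    private void InitializeKinect()
    {
        // Grab the first connected sensor and enable the streams the InteractionStream needs.
        this.sensor = KinectSensor.KinectSensors.FirstOrDefault(s => s.Status == KinectStatus.Connected);
        this.sensor.DepthStream.Enable(DepthImageFormat.Resolution640x480Fps30);
        this.sensor.SkeletonStream.Enable();

        this.skeletons = new Skeleton[this.sensor.SkeletonStream.FrameSkeletonArrayLength];

        // 'this' is the IInteractionClient implementation (GetInteractionInfoAtLocation above).
        this.interactionStream = new InteractionStream(this.sensor, this);

        this.sensor.DepthFrameReady += this.SensorDepthFrameReady;
        this.sensor.SkeletonFrameReady += this.SensorSkeletonFrameReady;
        this.interactionStream.InteractionFrameReady += this.InteractionFrameReady;

        this.sensor.Start();
    }

    private void SensorDepthFrameReady(object sender, DepthImageFrameReadyEventArgs e)
    {
        using (DepthImageFrame frame = e.OpenDepthImageFrame())
        {
            if (frame == null)
            {
                return;
            }

            // Feed every depth frame to the InteractionStream.
            var depthPixels = new DepthImagePixel[frame.PixelDataLength];
            frame.CopyDepthImagePixelDataTo(depthPixels);
            this.interactionStream.ProcessDepth(depthPixels, frame.Timestamp);
        }
    }

    private void SensorSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
    {
        using (SkeletonFrame frame = e.OpenSkeletonFrame())
        {
            if (frame == null)
            {
                return;
            }

            // Feed every skeleton frame (plus the accelerometer reading) to the InteractionStream.
            frame.CopySkeletonDataTo(this.skeletons);
            this.interactionStream.ProcessSkeleton(
                this.skeletons,
                this.sensor.AccelerometerGetCurrentReading(),
                frame.Timestamp);
        }
    }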
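And here is a rough sketch of the InteractionFrameReady handler described in step 4.  Again, this is a simplified illustration rather than the exact code from the attached sample; a real app would route the grip events to its UI layer instead of writing to the console:

    private UserInfo[] userInfos = new UserInfo[InteractionFrame.UserInfoArrayLength];

    private void InteractionFrameReady(object sender, InteractionFrameReadyEventArgs e)
    {
        using (InteractionFrame frame = e.OpenInteractionFrame())
        {
            if (frame == null)
            {
                return;
            }

            frame.CopyInteractionDataTo(this.userInfos);
        }

        foreach (UserInfo info in this.userInfos)
        {
            // A SkeletonTrackingId of 0 means this slot is not tracking a user.
            if (info.SkeletonTrackingId == 0)
            {
                continue;
            }

            foreach (InteractionHandPointer handPointer in info.HandPointers)
            {
                switch (handPointer.HandEventType)
                {
                    case InteractionHandEventType.Grip:
                        Console.WriteLine("User {0}: {1} hand gripped", info.SkeletonTrackingId, handPointer.HandType);
                        break;

                    case InteractionHandEventType.GripRelease:
                        Console.WriteLine("User {0}: {1} hand released", info.SkeletonTrackingId, handPointer.HandType);
                        break;
                }
            }
        }
    }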

 

I’ve posted some sample code that you may download and use to get started using InteractionStream in your own managed apps.  The code is loaded with tips in the comments that should help you down the path of using our interactions in your own apps.  Thanks to Eddy Escardo Raffo on my team for writing the sample console app.

Ben

@benlower | kinectninja@microsoft.com | mobile: +1 (206) 659-NINJA (6465)

Comments

  • Anonymous
    May 16, 2013
    Hmm, I made a Kinect mouse with this example... but what about the PUSH/PULL events or gestures? It only shows how to work with GRIP...

  • Anonymous
    June 17, 2013
    It would be great to see this in VB.Net. I have converted most of the code, but the interaction stream isn't registering anything in the console window.

  • Anonymous
    October 02, 2013
    Diego, sorry for the very delayed response. To get at the "pressed" vs. "not pressed" state, plus the press extent/progress, you should look at the "IsPressed" and "PressExtent" properties of the InteractionHandPointer object. So, in the sample code attached to this post you can add the following:

        if (handPointer.IsActive)
        {
            Console.WriteLine(
                "The {0} hand of user '{1}' is {2}, with a press progress of {3:0.00}",
                handPointer.HandType,
                info.SkeletonTrackingId,
                handPointer.IsPressed ? "pressing" : "not pressing",
                handPointer.PressExtent);
        }