

The Motion Tracking Robot Controller

The remote control has come a long way from the first remote-control patent, filed by Nikola Tesla in 1898, and the first wireless TV remote, the "Flashmatic," which used photocells.

Fast forward 100 years, and remote controls are breaking new ground for Human-Robot Interaction (HRI) by leveraging Natural User Interfaces (NUI), which let users carry out relatively natural motions, movements, or gestures that in turn control computer applications, manipulate on-screen content, or, in this case, drive a robot.

How it works

EDDIE, the reference platform used for this demo, comes with an 8-core Propeller microcontroller that directly controls two 12 V motors. The motors can be controlled remotely, or EDDIE can roam autonomously by leveraging several sensors around the robot and seeing in 3D with a Microsoft Kinect sensor. Gershon Parent, a developer with the Microsoft Robotics group, has added a new twist on how EDDIE can be wirelessly controlled, which he's dubbed the "Motion Tracking Robot Controller." By leveraging skeletal tracking through a Kinect sensor, Gershon controls the two 12 V motors with arm gestures, navigating EDDIE through his environment.
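To make the last hop in that chain concrete, here is a minimal sketch of sending differential-drive commands to a motor controller over a serial link. The port name, baud rate, and the "GO <left> <right>" command string are assumptions for illustration only; the actual Eddie control board command set is documented by Parallax, and RDS4 handles this layer for you.

```python
# Minimal sketch: sending differential-drive commands to a serial motor
# controller. The port, baud rate, and "GO <left> <right>" command format
# are hypothetical placeholders, not the documented Eddie protocol.
import serial  # pyserial


def send_drive_command(port: serial.Serial, left_power: int, right_power: int) -> None:
    """Clamp each wheel power to [-127, 127] and write one command line."""
    left = max(-127, min(127, left_power))
    right = max(-127, min(127, right_power))
    port.write(f"GO {left} {right}\r".encode("ascii"))


if __name__ == "__main__":
    # The COM port and baud rate are assumptions; adjust for your own setup.
    with serial.Serial("COM3", 115200, timeout=1) as eddie:
        send_drive_command(eddie, 64, 64)    # both wheels forward: straight line
        send_drive_command(eddie, 64, -64)   # wheels opposed: spin in place
        send_drive_command(eddie, 0, 0)      # stop
```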

[View:https://www.youtube.com/watch?v=mCO-FF8oQs0&feature=youtu.be]

When standing in front of the Kinect sensor, Gershon's right hand controls the right motor and his left hand controls the left motor, a bit like driving a tank. When he raises both arms simultaneously, the robot moves forward in a straight line, and the higher he raises his hands, the faster it goes. To put the robot in reverse, he simply lowers both arms at the same time. To turn, he tilts one hand above the other; the offset between his hands sets the degree of the turn.
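The mapping itself is simple enough to sketch. The snippet below is an illustration of the idea, not Gershon's actual code: it assumes hand and shoulder heights (in meters) are already available from skeletal tracking, and the 0.5 m reach and the [-127, 127] power range are arbitrary choices for the example.

```python
# Minimal sketch of a tank-style gesture mapping, assuming skeletal tracking
# already gives us each hand's and shoulder's height in meters. This is an
# illustration only, not the code used in the demo.

def hand_to_power(hand_y: float, shoulder_y: float,
                  full_reach: float = 0.5, max_power: int = 127) -> int:
    """Map a hand's height relative to its shoulder onto [-max_power, max_power].

    Hand at shoulder height -> 0 (stop); full_reach meters above -> full
    forward; full_reach meters below -> full reverse.
    """
    ratio = (hand_y - shoulder_y) / full_reach
    ratio = max(-1.0, min(1.0, ratio))          # clamp to [-1, 1]
    return int(ratio * max_power)


def gesture_to_drive(left_hand_y: float, left_shoulder_y: float,
                     right_hand_y: float, right_shoulder_y: float):
    """Left hand drives the left motor, right hand drives the right motor."""
    left_power = hand_to_power(left_hand_y, left_shoulder_y)
    right_power = hand_to_power(right_hand_y, right_shoulder_y)
    return left_power, right_power


if __name__ == "__main__":
    # Both hands 0.25 m above the shoulders: half speed straight ahead.
    print(gesture_to_drive(1.55, 1.30, 1.55, 1.30))   # -> (63, 63)
    # Left hand lowered, right hand raised: spin in place to the left.
    print(gesture_to_drive(1.05, 1.30, 1.55, 1.30))   # -> (-63, 63)
```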

What’s next?

Gershon’s demo is a great example of how NUI can be used to control a robot, but it’s just the tip of the iceberg. Many other kinds of control could be built on a Kinect interface: a steering wheel or a joystick, for example. And as the video shows, the robot carries its own Kinect sensor, so it could also be sensing its environment, detecting obstacles, and relaying that information back to the user (a simple sketch of that idea follows below). EDDIE’s control board provides additional I/O for a wide variety of sensors and accessories, such as the cameras we saw earlier on Roborazzi.
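As a rough illustration of that obstacle-detection idea (not part of the original demo), the sketch below scans a depth frame, here just a flat list of per-pixel distances in millimeters, and stops the drive if anything valid is closer than a threshold. The threshold and frame format are assumptions; it is not tied to any specific Kinect SDK call.

```python
# Minimal sketch of depth-based obstacle detection, assuming the depth frame
# arrives as a flat sequence of distances in millimeters (0 meaning "no
# reading"). Illustration only; not tied to a specific Kinect API.
from typing import Optional, Sequence, Tuple

STOP_DISTANCE_MM = 600  # assumed safety threshold: stop within 60 cm


def nearest_obstacle_mm(depth_frame: Sequence[int]) -> Optional[int]:
    """Return the closest valid depth reading, or None if there is none."""
    valid = [d for d in depth_frame if d > 0]
    return min(valid) if valid else None


def gate_drive(depth_frame: Sequence[int],
               left_power: int, right_power: int) -> Tuple[int, int]:
    """Pass drive powers through unchanged unless an obstacle is too close."""
    nearest = nearest_obstacle_mm(depth_frame)
    if nearest is not None and nearest < STOP_DISTANCE_MM:
        return 0, 0  # obstacle ahead: stop and let the user decide
    return left_power, right_power


if __name__ == "__main__":
    clear_frame = [1200, 1500, 0, 2100]       # nothing closer than 1.2 m
    blocked_frame = [1200, 450, 0, 2100]      # something 45 cm away
    print(gate_drive(clear_frame, 64, 64))    # -> (64, 64)
    print(gate_drive(blocked_frame, 64, 64))  # -> (0, 0)
```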

Get involved

Do you have an idea like the Motion Tracking Robot Controller? We would love to hear from you, and you can submit an entry in our Robotics @Home Contest. If you’re already using RDS4, we hope you’ll join our developer community for any technical assistance you might need. And finally, we’re always eager to hear from our community; reach out to us anytime on Facebook and Twitter if you want to learn more about what’s going on at Microsoft Robotics or to geek out over robots with us.

Comments

  • Anonymous
    March 24, 2012
    Do you have these projects downloadable anywhere? It'd be great to see the code.

  • Anonymous
    December 13, 2013
    I have the MEDUSA by Parallax... I wanted to build a "follow me" robot for my great-grandson, but near the end of completion I discovered that I had no software. I saw a movie that showed such a robot that was built by a Microsoft team. Is this software available anywhere? I really know nothing about software; however, I did build a BOE-BOT that used PBASIC, and I had just enough know-how to control it. I would be willing to pay a modest fee for the complete works that Microsoft built.

  • Anonymous
    May 16, 2015
    Can you send me the code at MAITIsoumia@gmail.com please?