North (formerly Thalmic Labs), the creator of the Myo armband, was acquired by Google in June 2020. Myo sales ended in October 2018 and Myo software, hardware and SDKs are no longer available or supported.
MyoPilot is a free, standalone Windows application that lets you fly the Parrot AR.Drone 2.0 with your Myo armband. We're big on controlling drones here, so we invited the author, Andreas Degele, to sit down and talk about how he built his application.
My name is Andreas Degele, and I am studying Computer Science at the Cooperative State University Stuttgart in Germany. In the third year, just before the bachelor's thesis (which I'm writing now), every student works on a small study project. My supervisor Sebastian suggested pairing an AR.Drone 2.0 with the Myo armband, and I was keen to implement it.
The start of the project was delayed a little because the Myo was stuck in German customs. However, this gave me enough time to code the necessary infrastructure. In my case, that is a C# application for Windows, but I guess you are not interested in how to build a GUI, so I'll focus on mapping commands from the Myo to the drone.
As you might know, the Myo supports five different gestures (double-tap, spread fingers, fist, wave left and wave right), but the AR.Drone 2.0 has ten important commands (left, right, forward, backward, turn left, turn right, up, down, take-off and land). For that reason, it is not feasible to simply assign one gesture to each command. Luckily, it is possible to combine a gesture with the orientation of the user's arm. Piloting the drone then boils down to combining the gestures, roll, pitch and yaw gathered from the Myo and mapping them to control the roll, pitch, yaw and gaz (altitude) of the drone. I chose to have double-tap make the drone take off and land, making a fist and moving your arm yields movement on the horizontal plane, and raising your arm while the fingers are spread changes the altitude.
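As a rough illustration, the mapping described above could look like the following. This is a Python sketch for brevity (the actual MyoPilot app is written in C#), and every name and the return shape here are my own, not the real MyoPilot or Myo SDK API:

```python
# Rough sketch of the gesture -> command mapping described above.
# All names are illustrative; MyoPilot's real C# code differs.

def map_pose(pose, airborne, roll, pitch, yaw):
    """Turn a Myo pose plus arm-orientation deltas into drone setpoints.

    Returns "take_off"/"land" for the double-tap toggle, otherwise a
    dict of roll/pitch/yaw/gaz setpoints (each assumed scaled to [-1, 1]).
    """
    if pose == "double_tap":
        return "land" if airborne else "take_off"
    if pose == "fist":
        # Moving the arm while making a fist steers on the
        # horizontal plane and turns the drone.
        return {"roll": roll, "pitch": pitch, "yaw": yaw, "gaz": 0.0}
    if pose == "fingers_spread":
        # Raising or lowering the spread hand changes the altitude (gaz).
        return {"roll": 0.0, "pitch": 0.0, "yaw": 0.0, "gaz": pitch}
    # Any other pose (or rest): hover in place.
    return {"roll": 0.0, "pitch": 0.0, "yaw": 0.0, "gaz": 0.0}
```

The key design point is that one gesture selects *which* axes the arm orientation drives, so five gestures comfortably cover ten commands.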
The gestures are pretty easy to get through the SDK, but the angles require a little bit of math.
First, I save a reference orientation when the user makes the fist or spread-fingers gesture. Then I calculate the difference from this reference in a loop running at about 30 cycles per second. However, if we passed these values to the drone as they are, it would be impossible to hover the drone in place: there is some noise in the orientation values, so the difference is never exactly zero. To compensate for this, I added a "dead zone" that zeroes each value below a certain threshold. That leaves a "hole" of zeroes in the range of values, so I applied a transformation to make the range continuous again. Don't worry, I won't torture you with the formula (although it is rather simple). One last thing is to adjust the sensitivity of the controls, because you want to be able to both manoeuvre carefully and go full speed. Achieving this is as simple as cubing the result. And that's pretty much everything necessary to convert gestures and orientation into actual commands for the drone.
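The three shaping steps (dead zone, re-stretching the range, cubing) can be sketched roughly like this. Again Python for brevity rather than the app's actual C#, and the threshold and maximum angle are made-up illustrative values, not the ones MyoPilot uses:

```python
import math

def shape_input(delta, dead_zone=0.15, max_angle=math.pi / 4):
    """Map an angle difference from the reference orientation (radians)
    to a drone setpoint in [-1, 1].

    dead_zone and max_angle are illustrative values, not MyoPilot's.
    """
    magnitude = min(abs(delta), max_angle)
    if magnitude < dead_zone:
        return 0.0  # suppress sensor noise near the reference pose
    # Re-stretch (dead_zone, max_angle] onto (0, 1] so the output ramps
    # up smoothly instead of leaving a hole at the dead-zone edge.
    scaled = (magnitude - dead_zone) / (max_angle - dead_zone)
    # Cubing keeps small deflections gentle while still reaching
    # full speed at the maximum arm angle.
    return math.copysign(scaled ** 3, delta)
```

Cubing preserves the sign and the endpoints 0 and ±1 while flattening the curve near zero, which is why it gives fine control for small arm movements and full speed for large ones.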
I hope you enjoyed this little peek at the technical background of MyoPilot. If you want to read more about it, visit the GitHub page, where you can find the code and the full project report, Implementing Gesture Control for a Quadcopter. Also, don't forget to check out MyoPilot itself.