This is a guest post written by Myo Alpha Developer Pascal Auberson. Pascal is based in London, UK, and is the founder of Lumacode, a consultancy specialising in creative development for web, mobile, and installations. Before that, he was co-founder and technical director of Specialmoves, an award-winning digital production studio. Currently, he's investigating better ways to navigate and interact in VR.

Overall, my goal is to improve the way people interact with virtual worlds. The Myo armband seemed like an input device that could do just that, thanks to its ability to wirelessly detect hand gestures and arm orientation. In most VR setups, both hands are needed for navigation, so the first thing I decided to tackle was moving all of those controls to a single hand, leaving the other hand free for interaction. With the combination of an iPhone, an Oculus Rift, and a Myo armband, I was able to do just that.

In my demo video I used an iPhone as the controller for navigation within the Oculus Rift VR world. The touch screen emulates a traditional analog stick and handles 2D movement: walking/running forwards and backwards, and strafing left and right. I then utilised the gyroscope roll to control body rotation. These touch and rotation events were sent over Wi-Fi via a UDP socket directly to Unity. I then modified the Oculus Character Controller script in Unity to accept these new navigation controls. I was quite comfortable using my left hand for navigation, so my right arm was then free to use the Myo armband to interact with the world.
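To give a feel for the Unity side of this, here is a minimal sketch of what such a receiver script might look like. The packet layout ("NAV <x> <y> <roll>"), the port number, and the field names are illustrative assumptions, not the actual protocol used in the demo:

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;
using UnityEngine;

// Listens for navigation packets from the phone on a background thread
// and applies them to a CharacterController on the main thread.
public class PhoneNavReceiver : MonoBehaviour
{
    public float moveSpeed = 3f;   // metres per second at full stick
    public float turnSpeed = 90f;  // degrees per second at full roll

    UdpClient client;
    Thread listenThread;
    CharacterController cc;

    // Latest values from the phone; float writes are atomic, so plain
    // volatile fields are enough for this one-way hand-off.
    volatile float x, y, roll;

    void Start()
    {
        cc = GetComponent<CharacterController>();
        client = new UdpClient(9000);   // assumed port
        listenThread = new Thread(Listen) { IsBackground = true };
        listenThread.Start();
    }

    void Listen()
    {
        var remote = new IPEndPoint(IPAddress.Any, 0);
        try
        {
            while (true)
            {
                byte[] data = client.Receive(ref remote);   // blocks
                string[] parts = Encoding.ASCII.GetString(data).Split(' ');
                float px, py, pr;
                if (parts.Length == 4 && parts[0] == "NAV"
                    && float.TryParse(parts[1], out px)   // strafe, -1..1
                    && float.TryParse(parts[2], out py)   // forward/back, -1..1
                    && float.TryParse(parts[3], out pr))  // roll tilt, -1..1
                {
                    x = px; y = py; roll = pr;
                }
            }
        }
        catch (SocketException) { /* socket closed on exit */ }
    }

    void Update()
    {
        // Emulate an analog stick: move in the local XZ plane and
        // turn the body with the phone's roll.
        Vector3 move = transform.right * x + transform.forward * y;
        cc.SimpleMove(move * moveSpeed);
        transform.Rotate(0f, roll * turnSpeed * Time.deltaTime, 0f);
    }

    void OnDestroy()
    {
        client.Close();
    }
}
```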

[Image: Pascal with iPhone]

I thought it would be great if I could throw grenades using my Myo-enabled hand, so I had a quick look around the Unity Asset Store. I soon found a suitable grenade model for only a few British pounds and a great explosion framework for free. I set up a Unity test scene with the grenade and explosion and got it all working with Unity's built-in physics.
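The grenade behaviour itself only needs a handful of standard Unity physics calls. A minimal sketch, assuming the Asset Store explosion effect is a prefab that can be instantiated; the field names and fuse/force values here are illustrative:

```csharp
using UnityEngine;

// Ticks down a fuse, spawns the explosion effect, and pushes
// nearby rigidbodies away with standard Unity physics.
[RequireComponent(typeof(Rigidbody))]
public class Grenade : MonoBehaviour
{
    public GameObject explosionPrefab;  // the Asset Store explosion effect
    public float fuseSeconds = 3f;
    public float radius = 5f;
    public float force = 700f;

    void Start()
    {
        Invoke("Explode", fuseSeconds);
    }

    void Explode()
    {
        // Spawn the visual effect, then knock back everything in range.
        Instantiate(explosionPrefab, transform.position, Quaternion.identity);
        foreach (Collider hit in Physics.OverlapSphere(transform.position, radius))
        {
            Rigidbody rb = hit.attachedRigidbody;
            if (rb != null)
                rb.AddExplosionForce(force, transform.position, radius);
        }
        Destroy(gameObject);
    }
}
```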

I then needed to get the orientation data from the Myo armband into Unity. I already had the socket connection working nicely with Unity for navigation, so I decided to use the iOS SDK for the Myo armband and pass the Myo data via the same UDP socket. The Myo sample Xcode project had all the code I needed and it was simple to copy the code and framework into my project. After setting a linker flag, the framework was ready to go.
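Since both controllers share one socket, the Unity receiver just needs to dispatch on the packet type. A sketch of the Myo side of that parsing, again assuming a simple space-separated layout ("MYO <x> <y> <z> <w> <pose>") rather than the app's real format:

```csharp
using UnityEngine;

// Parses one Myo datagram from the phone. NAV and MYO packets arrive
// on the same socket, so the receiver dispatches on the first token.
public static class MyoPacket
{
    public static bool TryParse(string msg, out Quaternion q, out string pose)
    {
        q = Quaternion.identity;
        pose = null;
        string[] p = msg.Split(' ');
        if (p.Length != 6 || p[0] != "MYO") return false;
        q = new Quaternion(float.Parse(p[1]), float.Parse(p[2]),
                           float.Parse(p[3]), float.Parse(p[4]));
        pose = p[5];   // e.g. "fist", "fingersSpread", "rest"
        return true;
    }
}
```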

The gesture events and orientation information were now coming into Unity smoothly, so I just needed to get the throwing interaction working. After a little experimentation, a fist gesture felt like the natural way to create a new grenade. At that point, all I had to do was detect which direction the grenade was moving in when it was released.
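Here is a sketch of how that gesture handling might look in Unity, with hypothetical names like grenadePrefab and handTransform. The grenade rides the hand kinematically until it is released with a velocity (the release logic is covered further down):

```csharp
using UnityEngine;

// Spawns a grenade in the virtual hand when a fist is made,
// and throws it when Release is called with a velocity.
public class GrenadeHand : MonoBehaviour
{
    public GameObject grenadePrefab;
    public Transform handTransform;   // end of the virtual arm rig

    GameObject held;
    string lastPose = "rest";

    public void OnPose(string pose)
    {
        // Spawn on the rest -> fist transition so a held fist
        // doesn't create a new grenade every frame.
        if (pose == "fist" && lastPose != "fist" && held == null)
        {
            held = Instantiate(grenadePrefab, handTransform.position,
                               handTransform.rotation);
            held.GetComponent<Rigidbody>().isKinematic = true; // ride the hand
            held.transform.SetParent(handTransform, worldPositionStays: true);
        }
        lastPose = pose;
    }

    public void Release(Vector3 velocity)
    {
        if (held == null) return;
        held.transform.SetParent(null);
        var rb = held.GetComponent<Rigidbody>();
        rb.isKinematic = false;
        rb.velocity = velocity;   // throw along the hand's motion
        held = null;
    }
}
```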

[Image: Pascal with Myo]

To indicate that the fist gesture had worked properly, I wanted the grenade to appear precisely where my fist was in real life. I then wanted to be able to move around realistically with the grenade in the virtual world. There were quite a few issues with this, although I managed to resolve them all in the end.

First off, I needed to convert the Myo orientation quaternion into a Unity quaternion – luckily for me, one of the other Myo Alpha developers had recently done this and posted the Unity code to the Alpha forum. The second issue was that the Myo armband's yaw rotation is measured relative to North, and in my application I needed to cancel that out. To do so, I added a reset button so that I could measure the yaw rotation at an appropriate point and subtract it from subsequent orientations.

I then had to create a rig of nested GameObjects in Unity to position the arm correctly relative to the body. I made a simplified arm model, from elbow to wrist only, which meant no shoulder or wrist rotation, but it still worked fine.

To detect the throw, I calculated the difference in grenade position from the last frame; if its magnitude was greater than a threshold, I triggered the creation of an exploding grenade and hid the grenade in the hand.
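Pulling those fixes together, here is a minimal sketch of the arm script. The axis swap in the quaternion conversion is an assumption (the correct mapping depends on how the components are serialised), OnOrientation is presumed to be wired up to the packet receiver, and the threshold value is illustrative:

```csharp
using UnityEngine;

// Applies the Myo orientation to the forearm rig, handles the yaw
// reset, and detects a throw from frame-to-frame hand movement.
public class MyoArm : MonoBehaviour
{
    public Transform hand;            // wrist end of the elbow-to-wrist rig
    public GrenadeHand grenadeHand;   // from the earlier sketch
    public float throwThreshold = 2f; // metres per second

    Quaternion latest = Quaternion.identity; // most recent Myo orientation
    float yawOffset;                          // yaw captured on reset
    Vector3 lastHandPos;

    // Right-handed Myo quaternion to left-handed Unity quaternion.
    // This particular axis swap is an assumed mapping for illustration.
    public void OnOrientation(float x, float y, float z, float w)
    {
        latest = new Quaternion(y, z, -x, -w);
    }

    void Update()
    {
        // The reset button cancels the compass-relative yaw: capture the
        // current yaw once, then subtract it from every later orientation.
        if (Input.GetKeyDown(KeyCode.R))
            yawOffset = latest.eulerAngles.y;

        transform.localRotation = Quaternion.Euler(0f, -yawOffset, 0f) * latest;
    }

    void LateUpdate()
    {
        // Throw detection: if the hand moved faster than the threshold
        // between frames, release the grenade with that frame velocity.
        Vector3 velocity = (hand.position - lastHandPos) / Time.deltaTime;
        if (velocity.magnitude > throwThreshold)
            grenadeHand.Release(velocity);
        lastHandPos = hand.position;
    }
}
```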

[Image: Pascal gesturing]

Overall, I was very pleased with the end result and felt it showed the potential of natural gestures and interactions within VR. In the future, it would make sense to detect the Myo Bluetooth events directly within Unity rather than via the iPhone app, as the two controllers should be independent. As for my next project, I'd like to build a more realistic arm model with inverse kinematic constraints.

To watch this video again, and other videos from our Alpha Developers, check out our Myo Alpha Developer YouTube Playlist.
