North (formerly Thalmic Labs), the creator of the Myo armband, was acquired by Google in June 2020. Myo sales ended in October 2018 and Myo software, hardware and SDKs are no longer available or supported.
This is a guest post written by Myo Alpha Developer Gabor Szanto. Gabor is from Hungary and is currently the CTO of Superpowered, as well as the developer behind the DJ Player app.
Emerging novel user input methods hold a lot of promise when it comes to ways we can better interact with our environment, especially with regard to the up-and-coming space of wearable tech. I was fortunate enough to be selected as a developer for the Myo™ Alpha program and have made it my goal to get the Myo armband integrated into the professional DJ booth.
While other novel input devices may rely on cameras for gesture control, or microphones for voice recognition, the Myo armband is able to interact using only gestures, making it much more effective in an environment with flashing lights, fog, and loud sounds. The Myo iOS SDK is really easy to understand, the documentation is simple to read, and you don’t need to deal with CoreBluetooth code at all. The HelloMyo example project shows you how to use every feature, and you receive events through delegate methods.
Originally my plan was to develop novel DJ gestures and implement these for control, but I quickly realized this wouldn’t be possible. Not because of the actual hardware, but because, as it turns out, I’m human. What I mean is that the Myo armband delivers accurate data at 50 fps, but the fundamental problem is that the way my body reports my arm’s position back to my brain is radically inaccurate. In other words, the average human has no truly accurate sense of what their arm is doing. For example, when I put on the Myo armband and pointed it at the horizon, I would think that my arm was parallel to the ground. However, the Myo data showed me that I was easily off by as much as 30 degrees.
The Myo Armband as a Virtual Instrument
I started to work on a DJ environment where virtual “instruments” surrounded me. For example, a loop roll on the left, a flanger on the right, along with a few poses here and there for various specific DJ moves. If this sounds familiar, this type of interaction is comparable to the musical gloves created by the artist Imogen Heap.
As it turned out, this approach doesn’t work well with the Myo armband and the DJ booth. At all. I tried for about 2 weeks, and although it might be great for other sorts of musical performers, it’s simply neither robust nor forgiving enough for DJ use, due to the need to set the orientation and its tendency to easily misalign.
Another issue was finding gestures that were both right and foolproof. Some gestures work great, but if you repeat the wrong gesture hundreds of times during the night, you’d better be ready for an arm workout.
Sensing a Virtual 2D FX Table
After the first experiment didn’t go as planned, I decided to create a more ‘traditional’ way to interact with the Myo armband in the DJ App by using a virtual 2D FX table.
It feels like having a large 40”, slightly tilted rectangular table in front of you. To make the feeling even more “precise”, I had to cheat a little bit due to the human inaccuracies mentioned before. The field has 20% additional padding, where the selected audio effect stays enabled but its parameters no longer change. Interestingly enough, I couldn’t sense this padding at all, it just worked naturally for my brain! I think little tricks like this are important for most accelerometer/gyro based input devices to provide the best feel.
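In rough C++ terms, the padding trick might look something like this. This is only my illustration of the idea, not DJ Player’s actual code: a raw axis value (derived from pitch or relative yaw, normalized so the table spans 0 to 1) is clamped at the table edge while the arm is inside the padding, so the effect stays on but its parameter stops moving.

```cpp
#include <algorithm>
#include <optional>

// Assumed 20% padding on each side of the normalized 0..1 field.
static const double kPadding = 0.2;

// Returns no value when the arm is outside the padded field (effect
// off), otherwise the parameter position clamped to the table edge.
std::optional<double> tableParameter(double raw) {
    if (raw < -kPadding || raw > 1.0 + kPadding) return std::nullopt;
    return std::clamp(raw, 0.0, 1.0);
}
```

The nice side effect of the clamp is exactly what I experienced: near the edges the parameter simply “sticks” instead of jittering, which the brain reads as a solid table boundary.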
This Virtual 2D FX Table proved to be spectacular for the audience, which is very important, since entertaining the crowd is the number one priority for a DJ. This method also ended up being robust enough for the DJ booth, and doesn’t require constant re-calibration.
The integration with my DJ Player app is complete, and I’m confident in both the results and in how useful the combination will be for professional DJs.
The app receives the Myo armband’s acceleration, orientation, and pose notifications through delegate methods, very similar to the HelloMyo example project. Please note, the yaw value is bound to north, so if you want a relative horizontal position, you should always calculate it as the difference from a previous event’s yaw value. In DJ Player’s case, I stored this “reference” yaw when the user enables Myo audio FX control with a gesture.
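The reference-yaw calculation can be sketched like this (again, my own illustration, not the app’s actual code). The one subtlety is wrapping the difference into the -π..π range, so that an arm crossing north doesn’t produce a huge jump in the relative position:

```cpp
#include <cmath>

// currentYaw comes from each orientation event; referenceYaw is the
// value stored when the user activated FX control with a gesture.
// Both are in radians, absolute (bound to north).
double relativeYaw(double currentYaw, double referenceYaw) {
    double delta = currentYaw - referenceYaw;
    // Wrap into -pi..pi so crossing north stays continuous.
    while (delta > M_PI) delta -= 2.0 * M_PI;
    while (delta < -M_PI) delta += 2.0 * M_PI;
    return delta;
}
```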
To enable Myo audio FX control, DJs should point to the air and do a fingers spread pose. They’ll receive a medium length vibration feedback if it was successfully detected, and the Virtual 2D FX Table “appears” in front of them, centered on the activation’s horizontal position. Disabling audio FX is similar, but they need to point to the ground and do a fingers spread pose. A short vibration confirms that the virtual table is gone. Evidently, the vibration is a very useful feature!
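The activation logic boils down to combining the detected pose with the arm’s pitch. Here’s a minimal sketch of that decision; the enum names and pitch thresholds are my assumptions for illustration, not the Myo SDK’s actual API:

```cpp
// Simplified pose set for this example (the real SDK reports more).
enum class Pose { Rest, Fist, FingersSpread };
enum class FXAction { Enable, Disable, None };

// pitch is in radians, positive when pointing above the horizon.
FXAction fxAction(Pose pose, double pitch) {
    if (pose != Pose::FingersSpread) return FXAction::None;
    if (pitch > 0.6) return FXAction::Enable;   // pointing to the air
    if (pitch < -0.6) return FXAction::Disable; // pointing to the ground
    return FXAction::None;                      // near horizontal: ignore
}
```

Leaving a dead zone around the horizon is deliberate: given how inaccurately we sense our own arm position, a gesture that only fires at clearly-up or clearly-down angles is much harder to trigger by accident.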
Another gesture is used to enable the crossfader control mode. To do so, the DJ just needs to point to the horizon, twist their arm to the right and make a fist. I created this gesture recognition by keeping a short history of the roll values, and comparing the current value to the 14th value in the past. If the change is big and a fist pose is detected within 1 second, it triggers the event and sends a long vibration.
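Here’s a rough reconstruction of that detector. The structure and the twist threshold are my assumptions; the idea from the text is the real part: compare each roll sample to the one 14 samples back (about 0.28 s at 50 fps), and if the twist is big, open a one second window in which a fist pose fires the gesture.

```cpp
#include <cmath>
#include <deque>

class TwistDetector {
public:
    // Feed one 50 fps sample; returns true when the gesture fires.
    // time is in seconds.
    bool addSample(double roll, bool fist, double time) {
        history.push_back(roll);
        if (history.size() > 15) history.pop_front();
        // Compare to the 14th value in the past, once we have one.
        if (history.size() == 15 &&
            std::fabs(roll - history.front()) > kTwistThreshold) {
            deadline = time + 1.0; // fist must follow within 1 second
        }
        if (fist && time <= deadline) {
            deadline = -1.0;
            return true; // enable crossfader, send a long vibration
        }
        return false;
    }
private:
    static constexpr double kTwistThreshold = 0.9; // radians (assumed)
    std::deque<double> history;
    double deadline = -1.0; // no window open yet
};
```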
Disabling the crossfader control mode is also very simple – just point to the ground, and a short vibration confirms that the crossfader is not controlled anymore.
As you can see, these gestures were made with the combination of value history, timing, and poses. They are also designed to be foolproof for the DJ booth, and always relative to the activation’s position.
Latency and CPU Usage
All the magic you see in the video above is based on uninterrupted, smooth 50 fps input, so being ready to process the signal input with low latency and zero jitter is paramount. Any interruption or minor stall in arm position input directly affects audio output, where even an untrained ear can easily hear small “hiccups”.
So, how can you achieve this? With extremely low CPU usage. The “busy-ness” of your CPU may delay the processing of the input events by a few milliseconds, and that can add up to noticeable jitter. This is one of the reasons we developed Superpowered. With the Superpowered SDK, you can achieve correct scheduling priorities, the lowest CPU usage for audio, and radically reduce jitter.
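To make “jitter” concrete: with events expected every 20 ms (50 fps), jitter is how far each inter-event interval deviates from that period. A tiny measurement helper, my own sketch rather than part of the Myo or Superpowered SDKs, could look like this:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

const double kPeriod = 0.02; // expected seconds between events at 50 fps

// Worst deviation (in seconds) from the ideal 20 ms event spacing,
// given the arrival timestamps of the input events.
double maxJitter(const std::vector<double>& timestamps) {
    double worst = 0.0;
    for (size_t i = 1; i < timestamps.size(); ++i) {
        double interval = timestamps[i] - timestamps[i - 1];
        worst = std::max(worst, std::fabs(interval - kPeriod));
    }
    return worst;
}
```

Logging this number while your app runs is a quick way to see whether CPU load elsewhere is delaying your input processing.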
The Myo armband and Superpowered open a new world for audio app creators, a world where APIs are not only high performance, but also very easy to use at the same time.
It’s really great to see that the Myo armband will start shipping this year and following launch I’m sure that Myo + DJ Player (running on Superpowered, of course) will find a permanent place in the DJ gear world.