The Myo armband is a powerful device that lets its wearer perform many different tasks using just their hands. But it is imperative to consider the interface between the user and the computer when designing applications, especially when no screen is involved in the interaction. Hi, I’m Mark DiFranco, a Software Engineer at Thalmic Labs. Today I’m going to outline design patterns that make a physical control experience more pleasant for users.


I’ll use our in-house Myo-Sphero application as an example. Sphero is a Bluetooth-enabled, remote-controlled sphere. You can control its movement by specifying a direction of travel and a speed. We were able to provide an intuitive and fun interface for controlling Sphero using the Myo device’s orientation – broken down as yaw, pitch, and roll – and hand gesture detection, but not without some experimentation and stumbling along the way!


To help break the experience down into smaller chunks, I’m going to use the framework Brian Boyko laid out in his GUI review of Windows 8. He breaks the user experience into four basic principles: Control, Continuity, Context, and Conveyance.

To start, let’s look at Control. The user should always feel in control when using the interface. When a user performs an action, it should produce an expected result, and when a user performs no action, nothing should change. The first version of the Myo-Sphero application was, unfortunately, not very good in terms of control. The app mapped the pitch and yaw of the user’s arm to the Sphero’s speed and direction, respectively. The user was always controlling the Sphero, with no way to stop giving input. This was not a very enjoyable experience, and it made the user feel “trapped”.


To solve this issue, we decided to make the control more explicit. We changed the controls so the user holds a fist in order to move Sphero. This way, they could rest their arm whenever they didn’t want to drive it. Next, we implemented a control zone: a range of angles within which commands take effect. These two changes eliminated accidental input.
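
Here’s a minimal sketch of that explicit-control logic, assuming a hypothetical pose name and a made-up control-zone width (the real Myo SDK types and the exact zone we used may differ):

```python
CONTROL_ZONE_DEG = 40.0  # hypothetical half-width of the pitch control zone

def drive_input(pose, pitch_deg, neutral_pitch_deg):
    """Return a normalized speed input, or None when Sphero should not move."""
    if pose != "fist":                  # no fist, no driving: the arm can rest
        return None
    offset = pitch_deg - neutral_pitch_deg
    if abs(offset) > CONTROL_ZONE_DEG:  # outside the control zone, ignore input
        return None
    return offset / CONTROL_ZONE_DEG   # scaled into [-1.0, 1.0]
```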

Early versions of the Sphero app simply scaled the roll, pitch, or yaw down to a value between -1.0 and 1.0 and used that linear value as the input. Squaring the input (while preserving its sign) before sending it to the Sphero made the response softer for subtle movements and stronger for larger ones. Now the user feels truly in control.
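
In Python, that non-linear mapping might look like the sketch below; copysign keeps the direction that plain squaring would otherwise discard:

```python
import math

def soften(x):
    """Square a normalized input in [-1.0, 1.0], preserving its sign."""
    return math.copysign(x * x, x)

# Subtle movements barely register, large ones stay strong:
# soften(0.2) -> 0.04, soften(-0.9) -> -0.81
```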


Continuity means that similar actions should yield similar results. At this point, the Sphero’s control scheme used the arm’s roll and pitch to determine speed and direction. Roll would move the Sphero left and right, while pitch would move it forwards and backwards, allowing the user to interpolate anywhere in between by combining the two. While this was an intuitive control scheme, it broke continuity.
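
One way to picture that combination is to treat roll and pitch as the two axes of a 2-D vector: its angle gives the direction of travel, and its length gives the speed. A rough illustration (my own sketch, not the app’s actual code):

```python
import math

def direction_and_speed(roll, pitch):
    """Combine roll (left/right) and pitch (forward/back), each in [-1.0, 1.0]."""
    heading_deg = math.degrees(math.atan2(roll, pitch)) % 360.0  # 0° = straight ahead
    speed = min(math.hypot(roll, pitch), 1.0)                    # clamped to full speed
    return heading_deg, speed
```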

The issue was that as soon as the user faced a different direction, the inputs no longer mapped to the Sphero properly. If the user turned around 180°, the controls were essentially mirrored. We solved this problem by adding the Myo armband’s yaw, which corresponds to the user’s heading, to the directional input for the Sphero. Thus, no matter which way the user was facing, the Sphero would move the same way from the user’s point of view.
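
The fix amounts to a single extra addition: offset the commanded direction by the armband’s yaw so that commands are interpreted in the user’s frame of reference. Continuing the sketch above (the sign of the offset depends on the SDK’s yaw convention):

```python
def sphero_heading(heading_deg, user_yaw_deg):
    """Shift a direction from the user's frame into Sphero's frame.

    Adding the user's heading means "forward" always moves the Sphero
    away from the user, whichever way they are facing.
    """
    return (heading_deg + user_yaw_deg) % 360.0
```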


Context, in terms of interfaces, means that the user always knows “where” they are. This is difficult to convey when there is no screen to display content. Fortunately, both Sphero and the Myo armband offer ways to give feedback: Sphero’s internal light can be set to any color, and the Myo armband has a vibration system. We used these features to communicate the state of Sphero to the user. Different control schemes take effect based on Sphero’s mode – calibration or driving – so it was important to convey which mode the user was in. We used distinct colors on the Sphero to indicate the current mode. Further, we pulsed the vibration on the Myo armband, like a heartbeat, whenever the user was actively moving the Sphero. This tactile feedback helped remind the user that they were modifying something – in this case, the position of the Sphero in the physical world!
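
A sketch of that feedback layer, with stand-in functions in place of the real Sphero LED and Myo vibration calls, and mode colors chosen purely for illustration:

```python
MODE_COLORS = {
    "calibration": (255, 255, 0),  # illustrative: yellow while calibrating
    "driving": (0, 0, 255),        # illustrative: blue while driving
}

def set_sphero_color(r, g, b):
    print(f"Sphero LED -> ({r}, {g}, {b})")  # stand-in for the Sphero SDK call

def pulse_myo():
    print("Myo vibration: short pulse")      # stand-in for the Myo SDK call

def update_feedback(mode, is_moving):
    set_sphero_color(*MODE_COLORS[mode])     # a distinct color per mode
    if is_moving:
        pulse_myo()                          # heartbeat-style pulse while driving
```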

The last principle, Conveyance, means the user should always know where to go next when navigating an interface. On a screen, this might be achieved with components like tabs and “breadcrumbs”. Without such mechanisms, it’s important to make gestures as intuitive as possible when dealing with a physical interface. We had the user make a fist to control Sphero, mirroring the feeling of grabbing an object to move it. Likewise, spreading the fingers stops the Sphero, echoing the “stop” gesture one might make while crossing the street. Choosing intuitive gestures helps provide Conveyance in physical interfaces.
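
Expressed as a simple gesture-to-intent table (the pose names here are illustrative, not the Myo SDK’s identifiers):

```python
GESTURE_INTENTS = {
    "fist": "drive",           # grabbing an object in order to move it
    "fingers_spread": "stop",  # the "stop" gesture from crossing the street
}

def intent_for(pose):
    return GESTURE_INTENTS.get(pose, "idle")  # unmapped poses do nothing
```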

Physical interfaces present new and interesting implementation challenges compared to traditional GUIs, but with a bit of planning they can deliver excellent experiences. Do you have any exciting interface ideas for the Myo armband? Let us know in the comments.

Sphero and the Sphero logo are trademarks owned by Orbotix, Inc.
