This post is a part of a series featuring members of our developer community. Stay tuned to our developer blog for more featured developers and other news about developing for the Myo armband!
Run off in all directions and let the music work for you. A hackathon team built an application that uses Myo gyroscope data to determine whether the user is standing, walking, or running, and selects matching tracks from the user’s playlist. We sat down with the team to hear more about their project.
Tell us about your hackathon team.
We’re students from the University of Waterloo in Ontario, Canada. Asad is a mechanical engineering student and Alya, D.Z., and I (Steven) are computer science students. We met at Waterloo Hacks 2016 and quickly realized that we shared an interest in wearable technologies.
Why did you choose to hack with Myo?
We were looking for a device that could provide accelerometer and gyroscope data, so that we could extract information about the user’s gait. Among all the wearables available at the hackathon, we found that the Myo’s performance was the most stable and accurate. Myo also provides an excellent SDK and documentation, so getting started with our project was fairly straightforward.
iOS app that transmits Myo data to the web server
What was your inspiration behind building MyoMuse?
When I used to go jogging, I always liked to listen to music. I found that the rhythm of the song playing affected the way I ran, and even how long I could run. The problem with listening to songs while running is that sometimes the song doesn’t match your pace, and you get side-tracked trying to change it. Our hack solves this problem by matching songs to the speed of the user’s motion, based on the beats-per-minute data of the user’s playlist.
Ideally, this would help users exercise by streamlining the song-selection process, and promote a healthier lifestyle by improving the musical aspect of working out.
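To make the matching idea concrete, here is a minimal sketch (not the team’s actual code; the function and data names are illustrative) of picking the playlist track whose tempo sits closest to the user’s current cadence in steps per minute:

```python
def pick_track(cadence_spm, playlist):
    """Return the (title, bpm) pair whose BPM is closest to the cadence.

    cadence_spm: the user's cadence in steps per minute.
    playlist: a list of (title, bpm) pairs, e.g. from BPM metadata.
    """
    return min(playlist, key=lambda track: abs(track[1] - cadence_spm))

playlist = [("Slow Song", 80), ("Walking Tune", 115), ("Running Anthem", 170)]
print(pick_track(150, playlist))  # → ("Running Anthem", 170)
```

A real implementation might also match tempo modulo doubling or halving, since a 170 BPM song can fit an 85 steps-per-minute walk just as well.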
What challenges did you face? What are you most proud of?
One of the biggest challenges was converting the raw gyroscope data on the x, y, and z axes into beats per minute for the music. We discussed how a gyroscope works in general, and conducted multiple experiments to see if we could find a pattern. By plotting z-axis data against time, we found a relationship between the peaks and troughs and the user’s movement. We then developed an algorithm to calculate the user’s gait information. We also had to identify and ignore the noise in the data, so that the app doesn’t switch tracks too frequently.
What do you plan to do next?
Our data analysis for detecting the user’s gait and selecting songs was a bit crude due to time constraints, and we feel we could make it a lot more accurate and better calibrated. One approach would be to use the Myo’s gesture detection to let the user “train” the app unobtrusively, allowing them to skip songs they don’t want to hear at their current running speed with a simple hand gesture. This data could also be used to help shape song suggestions for other users.
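The “training” idea described above could look something like the following. This is purely illustrative (the class and method names are hypothetical, not from MyoMuse): remember which tracks the user skipped at which cadence, and avoid suggesting them again near that pace.

```python
from collections import defaultdict

class SkipMemory:
    """Remembers skipped tracks per cadence band and filters them out."""

    def __init__(self, band_width=20):
        self.band_width = band_width      # cadence bucket size, steps/min
        self.skipped = defaultdict(set)   # cadence band -> skipped titles

    def _band(self, cadence):
        return int(cadence // self.band_width)

    def record_skip(self, title, cadence):
        """Note that the user skipped this track at this cadence."""
        self.skipped[self._band(cadence)].add(title)

    def allowed(self, titles, cadence):
        """Return the titles not previously skipped at a similar cadence."""
        blocked = self.skipped[self._band(cadence)]
        return [t for t in titles if t not in blocked]
```

Aggregating these skip records across users is one plausible way the data could feed song suggestions for others, as the team suggests.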
What do you envision for the future of wearable tech?
I see wearable technology as the next big thing in the industry. Wearing a piece of technology brings user interaction to a new level: when you wear it, you directly feel it and interact with it. Some wearable technologies focus on outputting information, such as a smartwatch that displays notifications in real time, while others focus on inputting information, such as the Myo armband, which analyzes and responds to user motion. I believe that in the future, wearable technologies will no longer be expensive accessories, but essential devices that are more capable, more affordable, and that everyone needs. Wearing a piece of technology in the future would be as normal as carrying a smartphone today, except it would be smaller in size and bigger in power.
Webpage that uses Spotify’s API to select tracks for a running playlist.