This post is part of a series featuring members of our developer community. Stay tuned to our developer blog for more featured developers and other news about developing for the Myo armband!
William, Nick, and Chaitya, freshmen from Georgia Tech, attended SwampHacks 2016 and had the opportunity to hack with the Myo armband. William tells us about his team's application that uses Myo to provide a unique interface that could be used by visually impaired persons.
My name is William, and together with my hackathon team I built Iris, an Android app that enables visually impaired persons to interpret their surroundings using Myo’s built-in hand gestures. What makes the Myo the centerpiece of the app is that it lets blind users operate it entirely without needing to look at the screen.
We came across the idea of Iris while exploring the unique ability of the Clarifai API to analyze images and parse them for descriptive tags. We broke Clarifai down into an equation: Picture + Clarifai = Tags. So why not flip that equation on its head and use tags to "create" a picture? Through the use of descriptive tags, Iris aurally illustrates a scene in the minds of the blind. After all, a picture is worth a thousand words.
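The "flipped equation" above, Tags → spoken picture, can be sketched in a few lines. This is a simplified illustration, not our production code; the tag values are made up rather than real Clarifai output:

```java
import java.util.List;

public class TagNarrator {
    // Turn a list of descriptive tags (as returned by an image-tagging
    // service such as Clarifai) into a short spoken-style sentence that
    // a text-to-speech engine can read aloud.
    static String describe(List<String> tags) {
        if (tags.isEmpty()) {
            return "I could not recognize anything in front of you.";
        }
        return "I see " + String.join(", ", tags) + ".";
    }

    public static void main(String[] args) {
        // Illustrative tags only, not real API output.
        System.out.println(describe(List.of("street", "people", "daylight")));
    }
}
```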
We challenged ourselves to create an app that could be easily used by the visually impaired. By simply opening the app and sliding your device into your breast pocket with the camera peeking out, you’re ready to go. Myo keeps our hands free, allowing us to interact with the app using hand gestures.
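The hands-free interaction boils down to routing each recognized hand pose to an app action. The real app receives poses through the Myo SDK's listener callbacks; the sketch below uses a stand-in `Pose` enum (whose names mirror Myo's pose set) so it runs without the Android or Myo dependencies, and the bound actions are hypothetical examples:

```java
import java.util.EnumMap;
import java.util.Map;

public class GestureRouter {
    // Stand-in for the Myo SDK's pose set, so this sketch is self-contained.
    enum Pose { REST, FIST, FINGERS_SPREAD, WAVE_IN, WAVE_OUT }

    private final Map<Pose, Runnable> actions = new EnumMap<>(Pose.class);

    // Bind a hand pose to an app action.
    void bind(Pose pose, Runnable action) {
        actions.put(pose, action);
    }

    // Called whenever a new pose is reported; unbound poses (e.g. REST)
    // are simply ignored.
    void onPose(Pose pose) {
        Runnable action = actions.get(pose);
        if (action != null) {
            action.run();
        }
    }

    public static void main(String[] args) {
        GestureRouter router = new GestureRouter();
        // Hypothetical bindings for illustration.
        router.bind(Pose.FIST, () -> System.out.println("capture photo"));
        router.bind(Pose.FINGERS_SPREAD, () -> System.out.println("repeat description"));
        router.onPose(Pose.FIST);  // triggers the bound "capture photo" action
        router.onPose(Pose.REST);  // no binding, so nothing happens
    }
}
```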
We faced some challenges because of the many asynchronous threads required to make Iris function. The camera, the Clarifai API, text to speech, and the loading screen all run on separate threads, which made it extremely difficult to get them working together. For the camera, we bypassed the built-in camera software and used Android’s hardware camera API directly. The catch is that this API is asynchronous, and it hands back raw image data that we had to process with byte streams and byte arrays. Our team was fairly new to asynchronous programming and working across threads, so this entire project was a great challenge. We made the API calls asynchronous, as the documentation suggested; however, with only six hours left, we had an application that broke whenever the async tasks finished in the wrong order. After a significant amount of refactoring, we reorganized the application so the stages run in a well-defined order, making it fully functional and capable of helping those who need it.
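The ordering problem described above (async tasks finishing out of order) is a classic one, and one common fix is to chain the stages so each starts only after the previous one completes. Here is a minimal sketch using Java's standard `CompletableFuture`; the three stage methods are dummy stand-ins for the real camera capture, Clarifai call, and text-to-speech, not our actual code:

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;

public class IrisPipeline {
    // Dummy stand-ins for the real asynchronous stages.
    static byte[] captureImage() {
        return new byte[]{1, 2, 3};               // raw bytes from the hardware camera
    }
    static List<String> tagImage(byte[] jpeg) {
        return List.of("street", "people");       // tags from the image-recognition call
    }
    static String speak(List<String> tags) {
        return "I see " + String.join(", ", tags) + ".";  // sentence handed to text-to-speech
    }

    public static void main(String[] args) {
        // Chaining guarantees capture -> tag -> speak ordering,
        // regardless of which background thread runs each stage.
        String spoken = CompletableFuture
                .supplyAsync(IrisPipeline::captureImage)
                .thenApply(IrisPipeline::tagImage)
                .thenApply(IrisPipeline::speak)
                .join();                          // block until the chain finishes
        System.out.println(spoken);
    }
}
```

Because each `thenApply` only fires when the previous stage's result is ready, no stage can ever observe out-of-order completion.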
Myo provided exactly the utility we needed for an app intended to help the visually impaired. From integrating the SDK into our app to actually using the armband to operate it, Myo was pleasant and easy to work with, allowing us to comfortably finish our app within the 24-hour period that SwampHacks allotted.
While we achieved most of what we had planned within the course of the hackathon, there are a few bugs we want to iron out and features we want to add before any final iteration we release publicly. If we can accomplish all of that, we hope to release Iris on the Myo Market.