UPDATE 1/24/2015: A poster based on our Machine Learning engine for Difficulty Level Classification has been accepted to AAAI 2015 @ UT Austin. Congrats to our coauthor Ankit Tandon for his first author poster acceptance!
UPDATE 5/1/2014: The work has been published in the CSUN Journal on Technology & Persons with Disabilities. The journal has been released!
UPDATE 3/24/2014: Lunar Tabs was presented at the 29th Annual International Technology and Persons with Disabilities Conference at CSUN 2014. The slides for the conference talk are available here.
Conference Papers related to project:
Tandon, Prateek, Stanley Lam, and Ankit Tandon. “Lunar Tabs: An Intelligent Screen Reader Friendly Guitar Tab Reader.” Proceedings of Journal of Technology and Persons with Disabilities. 2014.
Tandon, Prateek, and Ankit Tandon. “Personalized Difficulty Level Classification and Feature Analysis of Guitar Tablature.” (Working Paper in Submission).
Posters related to project:
Latest Build:
Download Lunar Tabs (Android Version)
Download Lunar Tabs (Desktop Version, usable with VoiceOver and NVDA)
Over 50 million people worldwide play the guitar. These days you can download a guitar tab online for virtually any song you want to learn. Sites like ultimate-guitar.com and 911tabs.com host enormous databases of guitar tabs in structured formats such as Guitar Pro (*.gpx) or Power Tab (*.ptb). If you want to learn a song, you can simply pick from the millions of user-generated tabs online. However, for the 285 million people worldwide with low vision, the existence of these tabs is of limited use, since most guitar tab readers aren’t as accessible as they could be. Instead, many blind users try to learn guitar purely by ear, but the process is time-consuming and frustrating.
Lunar Tabs is a guitar tab reader designed from the ground up with accessibility in mind. It takes as input an electronic guitar tab in a well-structured format and generates a sequence of text instructions for playing the piece, which are then fed to the user’s screen reader. A person who is blind can use Lunar Tabs to learn any song they want by harnessing the giant tab libraries online.
Accessible User Interface
One of the key challenges of designing the application is creating an accessible user interface that is usable by someone who is blind or has low vision. For persons who are blind, mobile devices have a setting called “Explore by Touch” (part of Android’s “TalkBack” screen reader).
When this mode is enabled, the device operates in a two-touch mode: a single touch on a component (such as a button) simply reads its label aloud, and a second touch actually activates it. In this way, a user who is blind can swipe around the touch screen to explore what the application offers before deciding which of its features to use.
The challenge users who are blind face is that Explore by Touch can sometimes be too slow, since they still have to swipe around. To remedy this, layouts for low-vision applications often place the key components and buttons on the edges or bottom of the screen. That way, a user who is blind has to swipe less to discover what functionality the application offers. Our layout is optimized with this consideration in mind.
The application supports many of the generic things one would want from a guitar tab reader. A user can load a tab file into the program, scroll through it measure by measure (or N measures at a time), scroll through instructions, jump quickly to various sections, play a sample of the currently viewed musical segment at different playback speeds, and play a sample of particular notes or chords.
However, the key difference between Lunar Tabs and other guitar tab readers is that Lunar Tabs presents music in a screen-reader friendly format. Beats in a guitar tab file are converted to text instructions that can be read aloud via Text to Speech as the user navigates the piece. The text instructions are optimized to provide plenty of information about the tab without being verbose. No one likes a screen reader that talks too much, though we still need to provide enough instruction to play the piece.
Instructions can be generated in two modes – “String/Fret” mode and “Note/Chord” mode.
In String/Fret mode, the hand configuration of a playable event (i.e., a note or chord) is presented in terms of the strings and frets to be played. Thus, the screen reader might announce “-,-,-,0,-,-” for a “G” (on a generic 6-string guitar). The instruction generator also conveys note duration information, such as “dotted eighth note.”
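To make this concrete, here is a minimal sketch of how a String/Fret instruction could be assembled from a beat’s notes and duration. The NoteEvent type is a hypothetical stand-in for whatever the tab parser actually provides, not Lunar Tabs’ internal data model.

```java
import java.util.Arrays;
import java.util.List;

// Illustrative sketch only: builds a String/Fret instruction string from a beat.
public class StringFretInstruction {

    /** One played note: string 1 = high E ... string 6 = low E, plus the fret (0 = open). */
    public static class NoteEvent {
        final int string;
        final int fret;
        NoteEvent(int string, int fret) { this.string = string; this.fret = fret; }
    }

    /** Builds text like "-,-,-,0,-,-, dotted eighth note" (strings listed low E to high E). */
    public static String describe(List<NoteEvent> notes, String durationText) {
        String[] frets = {"-", "-", "-", "-", "-", "-"};   // index 0 = low E ... 5 = high E
        for (NoteEvent n : notes) {
            frets[6 - n.string] = Integer.toString(n.fret);
        }
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < frets.length; i++) {
            if (i > 0) sb.append(",");
            sb.append(frets[i]);
        }
        return sb.append(", ").append(durationText).toString();
    }

    public static void main(String[] args) {
        // Open G string (string 3), as in the example above.
        System.out.println(describe(Arrays.asList(new NoteEvent(3, 0)), "dotted eighth note"));
    }
}
```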
Many intermediate to advanced guitar players, however, are conversant in higher-level elements of music such as chord progressions, keys, and note names. For these players, we also built a “Note/Chord” mode. This mode uses a database of chords and note names matched to hand configurations to generate the actual name of the chord or note a configuration signifies. Thus, if a “G major” chord is coming up in the piece, instead of providing the verbose hand placement description, this mode succinctly tells the user about the “G Major” chord. This does rely on the user knowing the hand configuration for the chord, and is thus a feature targeted at more advanced users.
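A rough sketch of the lookup, with a hypothetical table of just a few shapes (the real database covers far more chords and notes, and its on-disk format is not shown here), might look like this:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of Note/Chord mode: map a hand configuration (low E to high E,
// "x" = muted, "0" = open) to a chord name. The entries are an illustrative subset.
public class ChordLookup {
    private static final Map<String, String> CHORDS = new HashMap<String, String>();
    static {
        CHORDS.put("3,2,0,0,0,3", "G Major");
        CHORDS.put("x,3,2,0,1,0", "C Major");
        CHORDS.put("x,0,2,2,2,0", "A Major");
    }

    /** Returns the chord name, or falls back to the verbose string/fret description. */
    public static String describe(String handConfiguration) {
        String name = CHORDS.get(handConfiguration);
        return name != null ? name : handConfiguration;
    }

    public static void main(String[] args) {
        System.out.println(describe("3,2,0,0,0,3"));     // "G Major"
        System.out.println(describe("x,13,12,12,12,x")); // unknown shape: verbose fallback
    }
}
```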
Intelligent Segmentation Features
A person without visual impairments can scan the entire musical piece at once, identify the structural form and key repeating melodies, and judge which parts of the piece look more difficult than others. For a user with visual impairments, such capabilities must be facilitated through the instruction presentation. We are experimenting with algorithms to intelligently segment a music piece for presentation with a screen reader.
One feature that can be useful to someone learning guitar is to understand what the unique aspects of the piece are. It turns out that many popular songs (and music in general) are highly compressible with respect to repetition. There are two types of repetition in a musical piece: explicitly defined repetition (by virtue of repeat signs) and implicitly defined repetition that exists in the piece but is not marked by repeat signs. A nice feature to have is to figure out where the repetition occurs and present unique measures or segments only once when generating instructions.
For instance, the popular song Breakfast at Tiffany’s contains 118 measures of acoustic guitar instruction, yet only 10 unique measures. The time to learn those 10 measures and how to string them together is much less than the time needed to learn 118 measures naively. Our segmentation algorithm detects systematic repetition in the piece to help minimize the time required to learn a guitar tab: learning the key repeated material once saves the user much more time than learning it multiple times.
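The core of unique-measure detection can be sketched as simple deduplication. Assuming each measure can be reduced to a canonical string (such as its generated instruction text), this toy version counts the unique measures and records the order in which they recur:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch: map a piece onto its unique measures plus the order in which they recur.
public class UniqueMeasures {

    public static List<Integer> compress(List<String> measures, List<String> uniqueOut) {
        Map<String, Integer> seen = new HashMap<String, Integer>();
        List<Integer> order = new ArrayList<Integer>();
        for (String m : measures) {
            Integer idx = seen.get(m);
            if (idx == null) {
                idx = uniqueOut.size();
                seen.put(m, idx);
                uniqueOut.add(m);     // first time we see this measure: teach it once
            }
            order.add(idx);           // playback order refers back to the unique list
        }
        return order;
    }

    public static void main(String[] args) {
        List<String> piece = new ArrayList<String>();
        // A toy piece: verse measure A, A, chorus measure B, back to A.
        piece.add("A"); piece.add("A"); piece.add("B"); piece.add("A");
        List<String> unique = new ArrayList<String>();
        List<Integer> order = compress(piece, unique);
        System.out.println(unique.size() + " unique measures, order " + order); // 2, [0, 0, 1, 0]
    }
}
```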
Generic repetition in a musical string can be identified efficiently using Crochemore’s algorithm, which finds all repetitions in a string in O(n log n) time. Segmenting repetition according to a cost function on n-grams can be achieved via the weighted interval scheduling dynamic programming algorithm, also in O(n log n). Local Gestalt boundaries can be optimized using the Local Boundary Detection Model (LBDM). Thus, segmenting a piece according to a cost function can be accomplished quite efficiently by creatively casting the problem into computational problems with known efficient solutions.
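As an illustration of the weighted interval scheduling step, here is an O(n log n) sketch. How Lunar Tabs actually scores candidate segments (the n-gram cost function, Gestalt boundaries) is not shown; the weights are assumed to be given.

```java
import java.util.Arrays;
import java.util.Comparator;

// Weighted interval scheduling: pick a non-overlapping set of candidate segments
// with maximum total score, via sort + binary search + dynamic programming.
public class WeightedIntervalScheduling {

    public static class Segment {
        final int start, finish;
        final double weight;
        Segment(int start, int finish, double weight) {
            this.start = start; this.finish = finish; this.weight = weight;
        }
    }

    public static double maxScore(Segment[] segs) {
        Arrays.sort(segs, new Comparator<Segment>() {
            public int compare(Segment a, Segment b) { return Integer.compare(a.finish, b.finish); }
        });
        int n = segs.length;
        double[] opt = new double[n + 1];       // opt[j] = best score using the first j segments
        for (int j = 1; j <= n; j++) {
            // Binary search for the last segment that ends no later than segs[j-1] starts.
            int lo = 0, hi = j - 1, p = 0;
            while (lo < hi) {
                int mid = (lo + hi) / 2;
                if (segs[mid].finish <= segs[j - 1].start) { p = mid + 1; lo = mid + 1; }
                else hi = mid;
            }
            opt[j] = Math.max(opt[j - 1], segs[j - 1].weight + opt[p]);
        }
        return opt[n];
    }

    public static void main(String[] args) {
        Segment[] segs = {
            new Segment(0, 4, 3.0), new Segment(2, 6, 5.0), new Segment(6, 8, 2.0)
        };
        System.out.println(maxScore(segs));     // 7.0: pick [2,6] and [6,8]
    }
}
```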
Another feature is to segment the piece by markers specified by the artist (such as bridge, verse, chorus, etc.). If this metadata exists in the tab file, the segmentation can use it. When the metadata doesn’t exist, we are interested in identifying these sections by plugging music theory constraints into our optimization; however, we don’t yet know enough about what form these constraints should take. If you are a music expert, we’re interested in learning from you. Please help us!
Using the application in Hands-Free Modes
A key challenge facing many guitarists (not just those who are blind) is using music applications hands-free. When you have an instrument in your hands, it’s difficult to press buttons on your tablet or smartphone, and even when you can, it disrupts your workflow. We have prototyped three hands-free modes that can help anyone learn a guitar piece without breaking their workflow: voice actions, stomp mode, and MIDI following.
Voice Actions
Lunar Tabs supports voice actions to automate the GUI. There are two types of speech recognition: continuous and non-continuous. In non-continuous speech recognition, a voice dialog box appears for a couple of seconds in which the user can speak; then recognition stops. In continuous speech recognition, the mobile device is always listening for input. Most Android developers are aware of the excellent APIs the platform provides for non-continuous speech recognition. Google Now introduced continuous speech recognition to Android devices, though the standard API documentation did not advertise it, and many Android developers are still unaware that continuous recognition can be used in their own applications with a couple of Android Intent commands. Android does, in fact, provide API support for continuous speech recognition for developers.
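A minimal sketch of the continuous pattern, assuming the stock android.speech APIs (the exact Intent wiring Lunar Tabs uses may differ, and error handling and the RECORD_AUDIO permission check are omitted), is to keep a SpeechRecognizer alive by restarting it whenever it returns results or errors out:

```java
import android.content.Context;
import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import java.util.ArrayList;

// Rough sketch of "always listening" voice actions with the stock android.speech APIs.
public class ContinuousVoiceActions implements RecognitionListener {

    public interface CommandHandler { void onCommand(String spokenText); }

    private final SpeechRecognizer recognizer;
    private final Intent listenIntent;
    private final CommandHandler handler;

    public ContinuousVoiceActions(Context context, CommandHandler handler) {
        this.handler = handler;
        recognizer = SpeechRecognizer.createSpeechRecognizer(context);
        recognizer.setRecognitionListener(this);
        listenIntent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        listenIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
    }

    public void start() { recognizer.startListening(listenIntent); }

    @Override public void onResults(Bundle results) {
        ArrayList<String> hypotheses =
                results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        if (hypotheses != null && !hypotheses.isEmpty()) {
            handler.onCommand(hypotheses.get(0));  // e.g. "next", "back", "play", "toggle"
        }
        recognizer.startListening(listenIntent);   // restart: this is what makes it continuous
    }

    @Override public void onError(int error) {
        recognizer.startListening(listenIntent);   // keep listening after timeouts / no-match
    }

    // Remaining callbacks are not needed for this sketch.
    @Override public void onReadyForSpeech(Bundle params) {}
    @Override public void onBeginningOfSpeech() {}
    @Override public void onRmsChanged(float rmsdB) {}
    @Override public void onBufferReceived(byte[] buffer) {}
    @Override public void onEndOfSpeech() {}
    @Override public void onPartialResults(Bundle partialResults) {}
    @Override public void onEvent(int eventType, Bundle params) {}
}
```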
A user can say things like “next, back, up, or down” in Lunar Tabs to navigate through measures and instructions in the piece. They can also say “play” to play the currently selected section or “sample” to hear a MIDI synthesis of the notes in the currently selected chord. They can say “toggle” to toggle the instruction mode setting.
The use of voice actions allows the user to use the application without having to press buttons. However, it can still be slow and time-consuming if you want to get through lots of instructions fast. That is why we have…
Stomp Mode
In stomp mode, a user controls Lunar Tabs by stomping their feet. When a user wants to move to the next playing instruction, they just stomp near their Android device. The nearby vibration from the foot stomp is picked up by the device’s accelerometer and registers as a motion spike. Our application runs a spike detector to identify when such foot stomps occur near the phone.
Recommended use is on a carpet or other sufficiently springy surface that allows nearby vibration to propagate. Placing the Android device on a towel or pillow also seems to work well.
This is a rather efficient way to scroll through instructions. Of course, we are cognizant of the risk of a user stomping on their phone, and thus allow the sensitivity to be calibrated. While we do support using an external pedal, one of the key advantages of Lunar Tabs is that users do not need to purchase one to navigate their music hands-free: the accelerometer already on the phone can emulate an external pedal.
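A stripped-down sketch of the spike detector, with illustrative (not production) threshold and debounce values, might look like this:

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch of stomp detection: treat a sharp accelerometer spike above a
// calibratable threshold as a "stomp" and debounce repeated triggers.
public class StompDetector implements SensorEventListener {

    public interface StompListener { void onStomp(); }

    private final StompListener listener;
    private final float threshold;          // m/s^2 above gravity; user-calibratable
    private long lastStompMillis = 0;

    public StompDetector(StompListener listener, float threshold) {
        this.listener = listener;
        this.threshold = threshold;
    }

    public void register(Context context) {
        SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sm.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        // Acceleration magnitude minus gravity: near zero when the device is at rest.
        float spike = Math.abs((float) Math.sqrt(x * x + y * y + z * z)
                - SensorManager.GRAVITY_EARTH);
        long now = System.currentTimeMillis();
        if (spike > threshold && now - lastStompMillis > 400) {   // 400 ms debounce
            lastStompMillis = now;
            listener.onStomp();   // e.g. advance to the next instruction
        }
    }

    @Override public void onAccuracyChanged(Sensor sensor, int accuracy) {}
}
```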
MIDI Following
The MIDI Following feature in Lunar Tabs allows the application to follow the user as they play. If the guitar is equipped with a MIDI controller (such as the Fishman TriplePlay or a Sonuus unit) or is itself a MIDI guitar, the USB receiver can be plugged into the Android device using an On-the-Go (OTG) cable.
Using MIDI drivers for Android, JFugue for MIDI event analysis, and a custom MIDI processing algorithm, I was able to put together a chord tracker that follows the person as they play the piece. This is a fairly useful mechanism for using Lunar Tabs hands-free. There’s more work to be done, but I was surprised by how easy it was to get this kind of functionality running with a hexaphonic pickup.
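A simplified sketch of the matching step, with the MIDI input plumbing (USB driver, JFugue event parsing) left out, is to accumulate incoming pitch classes and advance once the expected chord has been covered:

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch of chord following: feed MIDI note-ons, advance when the current
// expected chord's pitch classes have all been heard. Not the app's exact algorithm.
public class ChordFollower {

    private final List<Set<Integer>> expectedChords;  // pitch classes (0-11) per instruction
    private final Set<Integer> heard = new HashSet<Integer>();
    private int position = 0;

    public ChordFollower(List<Set<Integer>> expectedChords) {
        this.expectedChords = expectedChords;
    }

    /** Feed one MIDI note-on; returns true when the current chord is complete and we advance. */
    public boolean onNoteOn(int midiNote) {
        if (position >= expectedChords.size()) return false;
        heard.add(midiNote % 12);                      // reduce to pitch class
        if (heard.containsAll(expectedChords.get(position))) {
            position++;                                // chord matched: move to next instruction
            heard.clear();
            return true;
        }
        return false;
    }

    public int getPosition() { return position; }
}
```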
Conclusion
Lunar Tabs can empower many people to learn guitar. Its accessibility features allow a person who is blind or has low vision to access musical instruction in a screen-reader friendly fashion, and its intelligent segmentation and hands-free features improve the application’s accessibility further. Many of these features may be useful not just for persons with disabilities but for users at large. Hopefully Lunar Tabs can improve the guitar learning experience for everyone.
The Project:Possibility team is currently in the user-testing phase for the application. A paper on Lunar Tabs has been accepted for presentation and publication at the CSUN Accessibility Conference 2014. Look for the talk there! There are, obviously, plans to add much more robotics and AI to this application, and there are many prototypes of functionality that have not yet made it into the build. Stay tuned for updates.
I would also like to point out that Lunar Tabs is an open-source project, so anyone can develop for it and add features. The application is free, as is the code.
Latest Build:
Download Lunar Tabs (Android Version)
Download Lunar Tabs (Desktop Version, usable with VoiceOver and NVDA)