How Inkling Improves Education Accessibility

Inkling is a very young product, but with just five months under its belt, it has grown up quickly. Our content continues to expand, making it ever more useful to students. And while we’ve got a lot of work to do on both content and the software itself, we’ve recently made some significant progress in one area that’s especially important to me: accessibility.

Those with impaired vision or hearing are often inadvertently excluded from the benefits of new technologies and innovations. With the printed book, blind readers have had the option of braille editions. Some with limited vision have been able to use large-print editions. Still others have listened to audio recordings of the book. For video, those who are hard of hearing have had the benefit of closed captioning. But as we move to ever more interactive and dynamic learning content, how do we ensure that everyone has an opportunity to benefit from these innovations?

While it’s impossible to ensure perfect parity between those who can see and those who can’t, we are working hard to narrow the gap. Nowhere is this more important than in education, where failing to narrow it denies a subset of people access to modern learning methods like those Inkling supports. The latest version includes our first major effort to enable access for those with impairments, and while it’s not yet perfect, it’s the first of many such improvements we plan to introduce.

The process of enabling assistive access is easy in some respects, but very difficult in others. Because an Inkling title is structured as an XML document, text-to-speech technologies make it relatively straightforward to provide access for the visually impaired. As such, Inkling supports full text-to-speech in every reading card in the platform using VoiceOver on the iPad. But how do we help a user find embedded glossary Poptips in that reader? Or “see” the images? How do we help a user understand that hints and feedback are available for a multiple-choice question? And what good is any of this if a blind user can’t buy content inside our store? We’ve attempted to tackle each of these cases, and many more, in the latest version of our software.
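To make the first of those points concrete, here is a minimal sketch of the kind of markup iOS exposes for this purpose, using UIKit’s UIAccessibility properties in Swift. It is an illustration only, not Inkling’s actual code; the view names and label text are hypothetical.

```swift
import UIKit

// A minimal sketch (not Inkling's code) of exposing pieces of a reading card
// to VoiceOver via UIKit's UIAccessibility properties. The view names and
// label text are hypothetical.
func exposeToVoiceOver(figureView: UIImageView, hintButton: UIButton) {
    // An image needs a spoken description, or a VoiceOver user cannot "see" it.
    figureView.isAccessibilityElement = true
    figureView.accessibilityLabel = "Figure: cross-section of the human heart"
    figureView.accessibilityTraits = .image

    // A hint tells the user that more is available behind this control,
    // for example feedback on a multiple-choice question.
    hintButton.accessibilityLabel = "Show hint"
    hintButton.accessibilityHint = "Reveals a hint for this question"
}
```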

Our first goal was to ensure that everything a user could do in the application could also be done by someone with a visual impairment. That meant inserting user interface elements that the iPad’s text-to-speech engine could hook onto, and designing the Inkling Store to detect VoiceOver and present its information differently to allow full text-to-speech access. Users with visual impairments can access their content, purchase new titles, search the book and even jump from figure to figure within a title, all without seeing the display.
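Detecting VoiceOver is the easiest part of that to show. The sketch below, written against today’s UIKit API in Swift rather than taken from Inkling’s code, checks whether VoiceOver is running and listens for changes; the view controller and its layout method are hypothetical.

```swift
import UIKit

// A sketch of detecting VoiceOver and adapting a screen's layout, in the
// spirit of what the post describes for the Inkling Store. The class and
// applyLayout(voiceOverRunning:) are hypothetical.
final class StoreViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // React if VoiceOver is turned on or off while this screen is open.
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(voiceOverStatusChanged),
            name: UIAccessibility.voiceOverStatusDidChangeNotification,
            object: nil
        )
        applyLayout(voiceOverRunning: UIAccessibility.isVoiceOverRunning)
    }

    @objc private func voiceOverStatusChanged() {
        applyLayout(voiceOverRunning: UIAccessibility.isVoiceOverRunning)
    }

    private func applyLayout(voiceOverRunning: Bool) {
        // With VoiceOver on, prefer a simpler, linear presentation that reads
        // naturally from top to bottom. Layout details are omitted here.
    }
}
```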

One very basic example of work done to accommodate the nonlinear structure of Inkling content can be seen when the iPad is reading through text. For the typical user, glossary Poptips are denoted by a small icon; tapping it presents the definition on top of the text. For a blind user, however, we had to engineer the system to indicate the availability of these terms as it proceeds through the text. When a blind user listens to content being read, these glossary terms are followed by a pause and the words, “defined term, link” before Inkling continues reading. When the user wants to hear the definition, he simply swipes right-to-left, taps twice and hears the definition. He can then double-tap to dismiss the definition and swipe down with two fingers to continue. It’s remarkable how quickly our blind users pick up these gestures and move through the content.
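On iOS, one way to build something like this is to represent each inline term as a custom accessibility element whose spoken label carries the “defined term, link” cue and which presents the definition when the user activates it with a double tap. The sketch below is only a guess at the shape of such code, not Inkling’s implementation; GlossaryTerm and the presentation closure are hypothetical.

```swift
import UIKit

// A hedged sketch of surfacing an inline glossary term to VoiceOver as a
// custom accessibility element. GlossaryTerm and presentDefinition are
// hypothetical; this is not Inkling's implementation.
struct GlossaryTerm {
    let word: String
    let definition: String
}

final class GlossaryAccessibilityElement: UIAccessibilityElement {
    private let term: GlossaryTerm
    private let presentDefinition: (GlossaryTerm) -> Void

    init(term: GlossaryTerm,
         container: UIView,
         presentDefinition: @escaping (GlossaryTerm) -> Void) {
        self.term = term
        self.presentDefinition = presentDefinition
        super.init(accessibilityContainer: container)
        // VoiceOver speaks the word, then the cue that a definition is available.
        accessibilityLabel = "\(term.word), defined term, link"
        accessibilityTraits = .link
    }

    // Called when the user double-taps while this element is focused.
    override func accessibilityActivate() -> Bool {
        presentDefinition(term)
        return true
    }
}
```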

People often forget that many of those with visual impairments are not, in fact, blind. There are varying degrees of ability in the blind community. So while we’ve designed Inkling as best we could to support those who see little or nothing of the device screen, we’ve also accommodated those with low vision. For example, when navigating through content, colored indicators in Outline View show your location, and a single tap reads the text aloud. A user can interact with onscreen elements through gestures while spoken guidance augments the visual cues. We’ve also avoided using color as the sole indicator of any piece of information inside Inkling, because it’s difficult for colorblind people to distinguish between some colors. So green “correct” indicators always also bear a check mark.
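That last principle is simple enough to show in code. A sketch of what such an indicator might look like in UIKit, again in Swift with hypothetical names, pairs the green color with a check mark symbol and a spoken label so that no single channel carries the meaning alone.

```swift
import UIKit

// A small sketch of the "never color alone" principle: a correctness
// indicator that combines green, a check mark, and a spoken label.
// The function name and symbol choice are illustrative only.
func makeCorrectIndicator() -> UIImageView {
    let indicator = UIImageView(image: UIImage(systemName: "checkmark.circle.fill"))
    indicator.tintColor = .systemGreen          // color, for sighted users
    indicator.isAccessibilityElement = true
    indicator.accessibilityLabel = "Correct"    // spoken by VoiceOver
    return indicator
}
```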

There are many questions we’ve had to answer in the course of improving the accessibility of Inkling. When you’re reading through a title, for example, at what point do you interrupt the reading to call out the figures that appear on the right? When is it appropriate to describe something to a user, and when does it become tedious? Do users want to know whether they’ve traversed a link before? There’s no “right” answer to such questions.

We know that what we’ve done isn’t perfect. Some aspects of the application still need further accessibility work, and we’re aware of bugs in the way text is read to the user. However, we’ll continue to improve Inkling over time to ensure that it’s accessible to everyone, just as education itself should be. Section 508 compliance is important, but we’re doing this because equal opportunity is simply the right thing to do.

If you’re blind, deaf or otherwise putting our accessibility efforts to use, please send us your feedback. We want to improve Inkling to make it as revolutionary to your education as it will be to everyone else’s.