Microsoft’s Efforts in Accessibility, Part 1

By Cheryl Heppner

September 2009

Editor: I have a real love-hate relationship with Microsoft. I love some of the stuff they do and hate some of the other stuff they do. Putting a lot of effort into accessibility is definitely in the former category.

Here’s Cheryl Heppner with her report on those efforts. This is part one of two parts.

~~~~~~~~~~~~~~~~~

Sean Hayes of Microsoft Corporation is known for his expertise in developing captioning technologies for the internet. His focus is on making internet-delivered media accessible. He is with Microsoft’s accessibility business unit, which has been a corporate function for three years and handles accessibility outreach and policy. Originally it was part of the Windows group, which still deals with the technology side of accessibility.

Microsoft’s vision is to enable everyone to reach their full potential, regardless of their challenges. One of the things they are passionate about is improving what they call the natural user interface. As computer applications move into new territory, they seek to keep the entire spectrum of disability needs at the forefront so no one gets left behind.

Sean gave the example of sustaining an injury that renders your arms useless for a few months. You have to learn to adapt, or, more importantly, have the computer adapt to you. Corporate ethics and a desire to do the right thing are involved, but it’s good business too. A study Microsoft commissioned from Forrester Research in 2003 found that 57% of the population surveyed could benefit from some of the accessibility features built into Windows.

Inclusive Innovation

Microsoft uses a term it calls “inclusive innovation,” which goes beyond universal design. Sean sees universal design as a process but not necessarily an outcome. He showed several different designs for what could be called a bicycle to illustrate that there is no one-size-fits-all, universal bicycle that everyone can use.

One of the ways Microsoft addresses this is with a user needs matrix. When designing products, they develop scenarios around different people who might use the product, such as a stay-at-home mom with two children or someone running a small business. The accessibility group created a number of personas with various kinds of disabilities. They take all those personas and examine aspects of product design, such as a text screen, to determine whether each aspect would present a barrier.
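For readers who like to see ideas in code, here is a minimal TypeScript sketch of a user needs matrix. The personas, design aspects, and barrier check are purely hypothetical illustrations; Microsoft’s actual personas and review process were not shown.

    // Hypothetical sketch of a user needs matrix: personas crossed with
    // design aspects, flagging potential barriers. All names and rules
    // here are illustrative, not Microsoft's actual personas or tooling.
    type Persona = { name: string; needs: string[] };
    type DesignAspect = { name: string; assumes: string[] };

    const personas: Persona[] = [
      { name: "low-vision user", needs: ["large text", "high contrast"] },
      { name: "deaf user", needs: ["captions", "visual alerts"] },
      { name: "one-handed user", needs: ["keyboard-only operation"] },
    ];

    const aspects: DesignAspect[] = [
      { name: "dense text screen", assumes: ["reading small text"] },
      { name: "audio-only alert", assumes: ["hearing the alert"] },
    ];

    // A very rough check: an aspect is a potential barrier for a persona
    // when one of its assumptions conflicts with one of the persona's needs.
    const conflicts: Record<string, string> = {
      "reading small text": "large text",
      "hearing the alert": "visual alerts",
    };

    for (const persona of personas) {
      for (const aspect of aspects) {
        const barrier = aspect.assumes.some(
          (a) => persona.needs.includes(conflicts[a] ?? "")
        );
        if (barrier) {
          console.log(`Review needed: "${aspect.name}" may block the ${persona.name}`);
        }
      }
    }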

This approach, Sean said, can get creative juices flowing and be a powerful model. Product groups are encouraged to think this way in their designs to get a wide range of inclusion when the product is brought to market.

Silverlight’s Access Features

Silverlight is a plug-in that goes into an internet browser. It enables a rich application environment, and one of the things you can do with it is stream high-quality audio and video. Sean showed a Silverlight media player embedded in an HTML page. The first thing it did was display captions. It could also provide audio description and switchable sign language translation. The media player has buttons labeled with the international symbols for closed captioning, audio description, and signing.
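Silverlight’s own player API was not shown in detail, so as a rough illustration only, here is a TypeScript sketch of the same idea using today’s HTML5 media elements: one video with a caption track, a parallel audio description track, and a sign language overlay, each of which can be switched on or off. The element IDs are assumptions.

    // Sketch of switchable captions, audio description, and sign language,
    // using HTML5 media elements rather than Silverlight's actual API.
    // The element IDs (#main-video, #sign-video, #description-audio) are
    // assumptions for illustration.
    const video = document.querySelector<HTMLVideoElement>("#main-video")!;
    const signVideo = document.querySelector<HTMLVideoElement>("#sign-video")!;
    const descriptionAudio =
      document.querySelector<HTMLAudioElement>("#description-audio")!;

    // Captions come from a <track kind="captions"> child of the video.
    function toggleCaptions(on: boolean): void {
      for (let i = 0; i < video.textTracks.length; i++) {
        const track = video.textTracks[i];
        if (track.kind === "captions") {
          track.mode = on ? "showing" : "hidden";
        }
      }
    }

    // The sign language translation is a second video overlaid on the first.
    function toggleSignLanguage(on: boolean): void {
      signVideo.style.display = on ? "block" : "none";
      signVideo.currentTime = video.currentTime; // keep it in sync
      if (on && !video.paused) void signVideo.play();
      else signVideo.pause();
    }

    // Audio description plays as a parallel audio element.
    function toggleDescription(on: boolean): void {
      descriptionAudio.muted = !on;
      descriptionAudio.currentTime = video.currentTime;
      if (on && !video.paused) void descriptionAudio.play();
      else descriptionAudio.pause();
    }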

Something of particular interest for people with hearing loss is the ability to separate the audio foreground from the audio background. The Silverlight player can have separate volume controls for each, so if you are having trouble understanding speech you can mute or eliminate the background sounds.
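Silverlight’s internal mechanism wasn’t described, but the concept can be sketched with the Web Audio API, assuming the producer delivers speech and background as separate audio elements (the element IDs here are made up).

    // Sketch of independent foreground/background volume controls using
    // the Web Audio API (not how Silverlight implemented it). Assumes the
    // producer supplies speech and background as separate audio elements;
    // the element IDs are made up for this example.
    const audioCtx = new AudioContext();

    const speechEl = document.querySelector<HTMLAudioElement>("#speech-track")!;
    const backgroundEl = document.querySelector<HTMLAudioElement>("#background-track")!;

    const speechGain = audioCtx.createGain();
    const backgroundGain = audioCtx.createGain();

    audioCtx.createMediaElementSource(speechEl).connect(speechGain);
    audioCtx.createMediaElementSource(backgroundEl).connect(backgroundGain);
    speechGain.connect(audioCtx.destination);
    backgroundGain.connect(audioCtx.destination);

    // Volumes range from 0.0 (silent) to 1.0 (full).
    function setSpeechVolume(volume: number): void {
      speechGain.gain.value = volume;
    }

    function setBackgroundVolume(volume: number): void {
      backgroundGain.gain.value = volume;
    }

    // A listener who struggles with speech in noise can keep the dialogue
    // and silence everything else:
    setSpeechVolume(1.0);
    setBackgroundVolume(0.0);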

An individual with a visual impairment who can’t use a mouse can operate Silverlight from the keyboard. If you have both hearing and vision loss, you can adjust the caption size and color or the background behind the captions.
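As a small, hypothetical sketch of what adjustable captions might look like in code (the overlay element and preference names are invented for illustration):

    // Sketch of user-adjustable caption appearance. The #caption-overlay
    // element and the preference shape are assumptions for illustration.
    interface CaptionPrefs {
      fontSizePx: number;      // e.g. 40 for a large-print reader
      textColor: string;       // e.g. "yellow"
      backgroundColor: string; // e.g. "black"
    }

    function applyCaptionPrefs(prefs: CaptionPrefs): void {
      const overlay = document.querySelector<HTMLElement>("#caption-overlay")!;
      overlay.style.fontSize = `${prefs.fontSizePx}px`;
      overlay.style.color = prefs.textColor;
      overlay.style.backgroundColor = prefs.backgroundColor;
    }

    // A viewer with both hearing and vision loss might choose large yellow
    // text on a solid dark background:
    applyCaptionPrefs({ fontSizePx: 40, textColor: "yellow", backgroundColor: "black" });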

TV vs. Internet

Moving from traditional television broadcasts to the internet, Sean said, opens up certain things that TV can’t do. When you’re streaming on the internet, you can pause to get other information and then resume the video. You can also capture a full text transcript that combines the audio description with a transcript of what is being said, or stop and go back to catch something you missed.
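As a rough sketch of how a player might assemble such a transcript, assuming the caption and description cues have already been read from their timed text files (the cue shape here is invented):

    // Sketch of building a combined transcript from caption cues and
    // audio description cues, merged in time order. The Cue shape is an
    // assumption; a real player would load these from its timed text files.
    interface Cue {
      startSeconds: number;
      text: string;
      kind: "caption" | "description";
    }

    function buildTranscript(captions: Cue[], descriptions: Cue[]): string {
      return [...captions, ...descriptions]
        .sort((a, b) => a.startSeconds - b.startSeconds)
        .map((cue) =>
          cue.kind === "description" ? `[Description] ${cue.text}` : cue.text
        )
        .join("\n");
    }

    const transcript = buildTranscript(
      [{ startSeconds: 2, text: "Hello, and welcome.", kind: "caption" }],
      [{ startSeconds: 0, text: "A presenter walks on stage.", kind: "description" }]
    );
    console.log(transcript);
    // [Description] A presenter walks on stage.
    // Hello, and welcome.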

Sean demonstrated a template for a product called Expression Encoder, a tool for generating video experiences that had its third release in July. You select your video and audio assets, encode them, and choose how you’d like the interface to look from a menu of templates. Fill in the information and it will generate the whole Silverlight experience, which you can then copy to your website, ready to go.

Demonstrating Microsoft Expression Encoder

Sean Hayes of Microsoft demonstrated the company’s Expression Encoder. Using an advertisement, he selected a template and began filling in its various fields.

For the caption file, he selected W3C Timed Text. “As bad as you might think captions are in the US, in the rest of the world they’re all over the map,” he said. Some countries have no captioning at all. Others do it in a completely different way than the US. But the internet is an international medium, so we can’t have a separate captioning technology for each country. W3C has been working on a captioning standard called “Timed Text,” and Sean is co-chair of that working group. The standard will cover how you author and exchange caption data. [NVRC Note: W3C is the World Wide Web Consortium, http://www.w3.org/].
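To give a feel for the format, here is a deliberately minimal Timed Text (TTML) caption document and a simplified TypeScript parse. The namespace shown is the one from the later published TTML recommendation, and real caption files carry styling, regions, and more precise timing; treat this only as a sketch.

    // A minimal W3C Timed Text (TTML) caption document and a simplified
    // browser-side parse. Real TTML files add styling, layout regions,
    // and richer timing; this is only a sketch.
    const ttml = `
    <tt xmlns="http://www.w3.org/ns/ttml" xml:lang="en">
      <body>
        <div>
          <p begin="00:00:01.000" end="00:00:04.000">Welcome to the demo.</p>
          <p begin="00:00:04.500" end="00:00:07.000">Captions come from a timed text file.</p>
        </div>
      </body>
    </tt>`;

    interface CaptionCue { begin: string; end: string; text: string }

    function parseTtml(source: string): CaptionCue[] {
      const doc = new DOMParser().parseFromString(source, "application/xml");
      return Array.from(doc.getElementsByTagName("p")).map((p) => ({
        begin: p.getAttribute("begin") ?? "",
        end: p.getAttribute("end") ?? "",
        text: p.textContent ?? "",
      }));
    }

    console.log(parseTtml(ttml));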

Next Sean incorporated an ASL file. This was a video of someone who had watched the non-ASL video and provided continuous ASL translation. It was filmed against a green background so it could be overlaid on the video. Sean also added audio description: a timed text file with markers indicating when each audio description clip should play.
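Expression Encoder’s own mechanism wasn’t spelled out, but the timed-marker idea can be sketched like this, with made-up marker times and clip names:

    // Sketch of playing audio description clips at timed markers, in the
    // spirit described above (not Expression Encoder's actual mechanism).
    // Marker times, clip URLs, and the video ID are made up.
    interface DescriptionMarker {
      atSeconds: number; // when a gap in the dialogue begins
      clipUrl: string;   // pre-recorded description audio
      played?: boolean;
    }

    const markers: DescriptionMarker[] = [
      { atSeconds: 5, clipUrl: "desc-001.mp3" },
      { atSeconds: 42, clipUrl: "desc-002.mp3" },
    ];

    const mainVideo = document.querySelector<HTMLVideoElement>("#main-video")!;

    mainVideo.addEventListener("timeupdate", () => {
      for (const marker of markers) {
        if (!marker.played && mainVideo.currentTime >= marker.atSeconds) {
          marker.played = true;
          void new Audio(marker.clipUrl).play();
        }
      }
    });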

You can also select other appearance options, such as whether the video should play automatically when someone visits the web page, which can be a bad choice because autoplay can interfere with screen readers. Once the features are selected, you hit the encode button, which takes all the components you have selected and puts them together.

The template and all of its code are available as an open source project that you can download if you’d like to use it for your own projects.

It’s Easy – Why Isn’t It Everywhere?

All this technology looks so easy, so why isn’t all of the media on the internet accessible? Sean asked the question he knew would be on everyone’s mind. His answer was that there are multiple formats and proprietary tools to deal with, particularly for generating captions. Captions usually go through an intermediate broadcast standard. As an example, a broadcast of the TV program “House” uses specific tools and delivery methods. If you buy or lease content from NBC or the BBC, the accessibility data they give you will be in only some of those formats.

An additional hurdle is that there are many playback environments. You may be playing the video on your computer or on a handheld device. Or you may be watching on a TV through a set-top box, through a game machine like the Xbox, or in a web browser.

“We can’t just say, ‘Let’s have CaptionMax build me a set of caption files and I’ll just run with that,’” Sean said. A request to NBC for a series comes with no guarantee you will get any of the caption data that was shown on TV. If a TV show is a live production, the caption files may not have been saved. There may be no record of which company did the captioning. Or the captions may have come from the cable provider. The entity with the IP (intellectual property) rights for the program may not have the IP rights for the captions that were generated when it was shown on TV.

“We don’t know where that data is, and it’s expensive to recreate it,” Sean said. He also does not have the right under intellectual property rules to do the captioning himself; studios are upset if third parties provide captions for their material, because there is an IP revenue stream in the captions.

Here’s Part Two

~~~~~

(c)2009 by Northern Virginia Resource Center for Deaf and Hard of Hearing Persons (NVRC), 3951 Pender Drive, Suite 130, Fairfax, VA 22030; www.nvrc.org. 703-352-9055 V, 703-352-9056 TTY, 703-352-9058 Fax. You do not need permission to share this information, but please be sure to credit NVRC.