A computer’s accessibility features make it easier to see, hear and use, while allowing for a custom fit, so to speak. Microsoft’s Accessibility Business Unit (ABU) is the go-to department when it comes to making the company’s products more accessible. It also ensures that Windows remains a platform that other companies use to develop innovative assistive technologies. Norm Hodne, Windows’ Accessibility Lead for Microsoft, recently gave Chet Cooper the 411 on the ABU.
Chet Cooper: Let’s start with a really tough question…
Cooper: If a train left Boston going 50 miles an hour…
Cooper: Okay, this one’s pretty tough, too: Why is accessibility such a challenge?
Hodne: I think it’s because there are various user interfaces [UI] that have different ways of presenting information to the person sitting at his or her computer. Because UIs are all so varied and use different controls, there has to be a way to standardize how assistive technology programs interact with them, which has been hard.
So we try to use programming to work around these constraints, sometimes using an interface: something between two applications that helps them talk to each other. Each step requires a combination of programming techniques, which adds to the complexity of the task.
Cooper: People have been trying to deal with accessibility issues for years. I’m surprised that it continues to be such a hurdle. Is it because certain people embrace the challenges on the front end, while others only try to adapt a program after it’s already been designed?
Hodne: I think that is a significant factor, especially if you look at websites. In the past, I don’t think a lot of web developers thought much about accessibility. Fortunately, more and more are starting to have the foresight to develop different types of tools that have accessibility built into them. A website generated with one of those tools should be a lot easier for people who rely on assistive technology to use.
Cooper: How long have you been involved in the accessibility realm?
Hodne: About two and a half years. I was in the Accessibility Technology group here at Microsoft when we were developing Vista, and now we’re working on the next version of Windows.
Cooper: When will that come out?
Hodne: We haven’t determined that yet.
Cooper: Do you think that products coming out of Microsoft these days will be more accessible, even in their default mode?
Hodne: Yes. When we released Vista, we asked throughout the company what we needed to do to make sure that we had the right focus, which included the accessibility component. For instance, there are more than 300 specialty assistive technology products available for Windows computers, such as screen readers, screen magnifiers and on-screen keyboards, which offer innovative solutions to help people with impairments and disabilities use their desktops and laptops effectively. We work closely with members of the Microsoft Assistive Technology Vendor Program to ensure that a wide range of assistive technology products are compatible with our major Microsoft product releases.
For the Windows division, our goal is to make our operating system the most accessible it can be in every single edition. We always seek to improve. For Vista, we implemented a new application programming interface [API], UI Automation, which is a substantial step forward from Microsoft Active Accessibility [MSAA]. Though the latter is still around and will be for a long time, it didn’t have the capability to adequately expose these new user interfaces, such as Silverlight or Windows Presentation Foundation [WPF] applications. WPF, for example, is used for the New York Times reader, a Windows application that allows a user to download full editions of the publication onto a desktop computer, where they can be read offline in a familiar yet customizable layout. So we had to revisit how we thought about accessibility APIs so that they could be flexible enough to deal with these new user interfaces.
What’s great about user interface automation is that it can automatically recognize a new custom control. Internally, it’ll respond, “Oh, I’ve never seen this control before, but because of how I’ve been programmed, I can figure out how I should interact with it and make it work.” Therefore, different applications should automatically be able to adapt to a control that they’ve never “seen” before because they’re using the automation. I think that’s a huge step forward in making it easier for end users, who won’t always have to wait for that next version of Window-Eyes, or whatever their favorite assistive technology tool is, to be able to access the new applications. We’re really excited about that.
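The idea Hodne describes can be sketched in a few lines. This is a conceptual illustration only, not the real UI Automation API: the class and function names here are invented to show how asking a control "which capabilities do you support?" lets a client drive a control it has never encountered before.

```python
# Hypothetical sketch of pattern-based discovery (invented names, not the
# actual UI Automation interfaces).

class InvokePattern:
    """A capability a control can expose: 'I can be activated.'"""
    def invoke(self):
        raise NotImplementedError

class BrandNewCustomControl(InvokePattern):
    """A custom control an assistive technology has never seen before."""
    def __init__(self):
        self.activated = False

    def invoke(self):
        self.activated = True

def screen_reader_activate(control):
    """Generic client: checks which patterns the control supports rather
    than which control class it is, so unknown controls still work."""
    if isinstance(control, InvokePattern):
        control.invoke()
        return True
    return False

control = BrandNewCustomControl()
screen_reader_activate(control)  # the "never seen" control still responds
```

Because the client keys off the supported pattern instead of the concrete control type, shipping a brand-new control does not require shipping a new version of the screen reader, which is the step forward Hodne is pointing to.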
Plus, with user interface automation, we’re getting 10 times the performance that we were getting from Microsoft Active Accessibility. So when we do a standard screen-scraping on an application, it’s 10 times faster than what we’re seeing right now, which should dramatically improve the performance for end users.
Cooper: Could you explain that a bit more?
Hodne: Let’s say you have a Microsoft Word application. The screen reader is going to go through and read the entire page and figure out what’s on this page: what type of buttons, what type of text, and so on. That process of reading through the page is called “screen scraping.” Today we’re seeing that UI Automation is 10 times faster than the Microsoft Active Accessibility API. We anticipate a huge benefit to end users once this gets widely adopted within the industry.
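"Screen scraping" as Hodne describes it is essentially a walk over the tree of elements on the page. The sketch below is a simplified model, assuming a hypothetical dictionary-based element tree; real accessibility trees are far larger, which is why a 10x traversal speedup matters.

```python
# Simplified model of screen scraping: recursively visit every element in
# an accessibility tree and produce one "spoken" line per element.
# The tree shape and field names here are invented for illustration.

def scrape(element, depth=0):
    """Yield a description line for this element, then for its children."""
    yield f"{'  ' * depth}{element['role']}: {element['name']}"
    for child in element.get("children", []):
        yield from scrape(child, depth + 1)

page = {
    "role": "document", "name": "Letter.docx",
    "children": [
        {"role": "button", "name": "Save"},
        {"role": "text", "name": "Dear Ms. Smith,"},
    ],
}

lines = list(scrape(page))
# Three elements in the tree -> three lines for the screen reader to speak
```

A real screen reader repeats this kind of walk constantly as the screen changes, so the cost of each element visit multiplies across thousands of elements, which is where the performance difference between the two APIs shows up for end users.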
Cooper: What other concerns did you consider, beyond the speed of the system?
Hodne: As I was saying earlier, if you go into Microsoft Word, you start to see all these compound controls. It might be a list box that combines functionality in ways you wouldn’t have seen in the past. For instance, maybe in the past you just saw a list box, but now it’s got a button with a drop-down that provides additional information, so you can filter out what you don’t want in your list. In the old system, all of those new controls were hard to describe. In the new system, it’s possible to describe them in a fairly easy and straightforward way.
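One way to picture how a compound control becomes describable is that each capability is advertised separately. The sketch below uses invented class names, not the real UI Automation API: a list box with a filtering drop-down button, like the one Hodne mentions, exposes two patterns at once, and a screen reader builds its description from whichever patterns it finds.

```python
# Hypothetical pattern model (invented names) for describing a compound
# control: a list box that also carries a drop-down filter button.

class SelectionPattern:
    """Capability: 'I hold a list of selectable items.'"""
    def items(self):
        raise NotImplementedError

class ExpandCollapsePattern:
    """Capability: 'I have a part that can drop down and fold up.'"""
    def expand(self):
        raise NotImplementedError

class FilterableListBox(SelectionPattern, ExpandCollapsePattern):
    """A compound control: list box plus drop-down filter button."""
    def __init__(self, entries):
        self._entries = entries
        self.dropdown_open = False

    def items(self):
        return list(self._entries)

    def expand(self):
        self.dropdown_open = True

def describe(control):
    """Compose a spoken description from the supported patterns."""
    parts = []
    if isinstance(control, SelectionPattern):
        parts.append(f"list with {len(control.items())} items")
    if isinstance(control, ExpandCollapsePattern):
        parts.append("has a drop-down")
    return ", ".join(parts)

box = FilterableListBox(["Alpha", "Beta"])
# describe(box) -> "list with 2 items, has a drop-down"
```

In the old single-role model, this control could only be labeled as one thing; composing the description from multiple patterns is what makes compound controls "easy and straightforward" to describe.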
Cooper: Going back to websites, in the old system you had to be concerned with the way you built tables for screen readers to know if they were reading html code and such. Is this affected as well?
Hodne: In the case of web pages, our Internet Explorer group has been working through the World Wide Web Consortium (W3C) and their Accessible Rich Internet Applications [WAI-ARIA] specification to find what that standard would be for web developers. We’ve taken that standard and said, “OK, what would we need to do to be able to implement that web standard through UI Automation?” We’ve also published that information on the web and provided it to all the assistive technology vendors, the Accessibility-Interoperability Alliance (AIA) members and the like, to let them know that this is how you would use UI Automation to be able to implement an accessible web page, or to be able to read an accessible web page that’s using the W3C standard.
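At its core, the published guidance Hodne mentions is a mapping: for each WAI-ARIA role on a web page, which UI Automation control type should the browser expose? The sketch below shows the shape of such a table with a few illustrative entries; it is a simplified assumption, not the actual published mapping, and real mappings also cover states and properties.

```python
# Illustrative (and heavily simplified) role-to-control-type table of the
# kind a browser would use to expose ARIA markup through UI Automation.
# Entries are examples for the sketch, not the official mapping document.

ARIA_TO_UIA = {
    "button":   "Button",
    "checkbox": "CheckBox",
    "combobox": "ComboBox",
    "slider":   "Slider",
    "textbox":  "Edit",
}

def uia_control_type(aria_role):
    """Fall back to a generic custom control for unmapped roles."""
    return ARIA_TO_UIA.get(aria_role, "Custom")

# A screen reader querying through UI Automation then sees a native-feeling
# control type instead of raw HTML.
```

Publishing one shared table means every browser and every assistive technology vendor interprets the same ARIA markup the same way, which is the interoperability goal of routing the W3C standard through UI Automation.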
No solution is perfect, but what we’re doing is taking this new W3C standard and seeing what we can do in the future to be able to support all the different functionality that they’re calling out in their standard. We talked to IBM about it, and we looked at the IAccessible2 specification that they’re working with through the Linux Foundation. There are issues there as well. So we have a really good understanding of what those issues are, and we’re working with the Linux Foundation and working again with the W3C through the Internet Explorer people to resolve those issues.
That’s one of the reasons for the AIA. We set it up because we were having these conversations individually with companies such as IBM and Sun Microsystems. Or we’d call GW Micro, an adaptive technology vendor of products for the visually impaired, and say, “Hey, we’re looking at this, and we see that there are some issues here. How would you want to implement this in your products?” Then we’d talk to IBM and say, “Hey, we’re looking at this specification, and we see that there are issues. How do you think we should work around these issues?” Everyone had input and different preferences on the right way to go.
We decided that we really needed to get an industry group together that could deal with UI Automation issues, and work on harmonizing the specifications to implement them correctly across platforms, so it didn’t matter whether it was Windows or Apple Macintosh or Linux. We wanted to look at all the different specifications in order to find the best way to create accessible applications. That’s how AIA originally got started.
Cooper: What’s the difference between the Assistive Technology Industry Association [ATIA] and the AIA?
Hodne: Well, they have similar membership, but the AIA is a technologically focused organization where we really get down to the nitty-gritty of developer specifications. That’s what we focus on. We don’t get into policy or the other areas of the assistive technology world, which are really some of the venues that ATIA gets into. I personally am not part of that decision-making process or that organization, but Rob Sinclair is, because of his role as director of the Corporate Accessibility Group at Microsoft.
Cooper: Are you familiar with Gary Moulton’s work?
Hodne: Gary has been focused on usability and accessibility as it relates to aging. He holds regular conferences internally here at Microsoft, trying to raise awareness of the impact of aging throughout all the different product groups and in Microsoft Research. It’s helped us realize that accessibility must go beyond addressing the needs of those with severe disabilities. It also includes anybody with mild or temporary disabilities, such as arthritis or a shoulder injury. In fact, when we were getting ready to ship Vista, Rob had his shoulder in a sling for six weeks and needed assistive technology to do his job.
Cooper: I’m using bigger font sizes these days, which I’m sure is due to aging.
Hodne: Exactly. We want to make sure that everything we do moves our technology forward to help people use our systems. So when we think about how we’re going to develop our applications, we think now in terms of multi-modal use. Can you use our product with a pen on a tablet PC? Can you use speech with it? Can you use a keyboard with it? Can you use a mouse? Can you use assistive technology devices?
You’ll see that we’re going to be stepping up our documentation for developers, offering testing tools, increasing our blogs and other information for them, and running porting labs. You’ll probably see some additional direct investments in the community to enable developers to adopt UI Automation in their products sooner rather than later. We understand that there are some real key issues keeping people from investing, and a lot of it has to do with the vendors out there. Just like with us, if you have something that’s working pretty well, it comes down to a question: do you want to invest to increase the performance of what you already have, or do you want to put in new features? There’s always that push-pull. “Do I really want to take the time to implement a new API when the one I’m currently using seems to work okay?” We’re trying to get the momentum going so that the industry will make strategic investments and innovate faster.
Cooper: Can you give me an example?
Hodne: For one, we’re going to be releasing a couple of new testing tools in the near future, which we hope will help people more quickly understand what the accessibility issues are and help them solve problems. So that’s an internal investment we’ve made.
I finished a document recently that starts to spell out some of the additional steps that we could take to innovate more quickly, but I can’t give you the details because they have yet to be approved.
Cooper: One of the things that I see industry-wide is the error message. It pops up on the screen, but it’s not a click-and-play. Meaning, you can’t click on the message and figure out how to correct the problem simply. The message just describes what’s wrong, and sometimes it’s tough to even figure out how to fix it.
Hodne: I know what you mean. These testing tools are geared toward developers or testers of applications, but even with these tools, we’ve created additional documentation that helps the developer or tester understand the origin of a particular error. Even developers, if they’re not extremely familiar with the technology, may be confused by a terse error message. So we’re trying to be a little more informative about what an error really means and what’s needed to correct it. I totally agree with you there.
Cooper: I love the kind of program that explains why something is wrong, what it’s going to do to change it, and then you click and it’s corrected. If you could program that into everything you do, everybody would love you guys.
Hodne: Yeah, I’m really excited about the way that specifications are developing in the W3C, the Linux Foundation, and the AIA, because I think we’re getting a much clearer understanding throughout the industry about how we can simplify the presentation of information through assistive technology tools. Instead of just starting at the top of a web page and reading through all of the controls and all of the information that’s available there, we’re asking whether we can do that in a more intelligent way. That obviously requires a partnership between the people developing the web pages and the assistive technologies that enable it. I think, over the next few years, people who use these tools will have better experiences because of these specifications and standards that are emerging now.
A VIEW FROM THE INSIDE
We caught up with Loren Mikola, Microsoft Corp.’s Disability Inclusion Program Manager, and asked him a couple of questions about what’s happening at the company’s Redmond, WA, headquarters with regard to employees (and potential future employees) with disabilities.
Chet Cooper: What do you consider best practices around disability for Microsoft?
Mikola: Microsoft has a strong diversity and inclusion program. There are a number of best practices, including education and awareness programs such as disability-related training for recruiters, managers and fellow team members. We also have a comprehensive accommodations program with a centralized budget, and employee affinity groups and networks focused on disability. We provide interpretation and captioning services for our employees who are deaf and hard-of-hearing, and we provide orientation and mobility services for employees who are blind or visually impaired. American Sign Language classes are offered to any interested employees.
Cooper: What Microsoft programs do you consider strengths for the company in terms of outreach for people with disabilities?
Mikola: Microsoft has a strong youth outreach program that exposes young people to technology and related careers. We ensure that students with disabilities are included in those programs. Some students have gone on to become interns and regular hires. Microsoft also is an active member of the Washington State Business Leadership Network, an organization that educates businesses on the advantages of seeking and hiring persons with disabilities. Finally, Microsoft participates in various outreach events to educate college students and graduates, such as those sponsored by Career Opportunities for Students with Disabilities.