How can accessibility bugs help us consider process improvements at a higher level?
The Narrator screen reader listing the headings available at the Sound page of the Settings app.


This article describes technical details relating to three accessibility bugs, and reflects on how those bugs can help us consider the higher level challenges which might be the root cause of the bugs.

 

Introduction

Over the last few weeks, I’ve been contacted by devs who had questions relating to accessibility bugs assigned to them. Once we’d discussed the bugs and figured out next steps for their resolution, I reflected on what higher-level challenges might have led to those bugs existing in the first place. After all, how often is an accessibility bug really a symptom of a more fundamental issue?

This article describes the three accessibility bugs, along with considerations on related high-level challenges.

 

Bug 1: I’m not making all my customers aware of the headings in my UI

When semantic UI exposes information about the headings it presents, your customers using screen readers can navigate through the UI via the headings. This is a really important mode of navigation, as your customers can move rapidly from one heading to the next, until they find a heading of interest, and then proceed to navigate through the detailed content that follows the heading.

Your sighted customers aren’t forced to consume in detail all the content on a page before they reach the content they’re most interested in, so why force any of your customers to do so?

In the case of this particular bug, text in the UI was presented visually using a large font, but was not being exposed semantically as a heading. As such, sighted customers might assume that the large text was indicative of a heading, but customers using screen readers were blocked from learning of those intended headings.

This UI was built using the UWP XAML UI framework, and today that UI framework does make it straightforward to expose headings semantically. (This support was added to UWP XAML in Windows 10 build 17134.)

To have a TextBlock exposed semantically to screen readers as being a heading, use AutomationProperties.HeadingLevel. For example, for the main heading on a page:

<TextBlock 
    AutomationProperties.HeadingLevel="Level1" 
    Text="This is heading level 1" />
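Subheadings further down the page’s visual hierarchy can be exposed the same way. As a sketch, assuming the page has second-level headings beneath that main heading:

```xaml
<!-- A subheading beneath the main heading. HeadingLevel accepts
     values from Level1 through Level9, with None as the default. -->
<TextBlock 
    AutomationProperties.HeadingLevel="Level2" 
    Text="This is heading level 2" />
```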


With the text now being exposed semantically as a heading, the Accessibility Insights for Windows tool (AI4W) can be pointed at the text to verify that the text is being exposed through the UI Automation (UIA) API as a heading.

The image below shows the AI4W tool reporting that text in a TextBlock is being exposed as a heading of level 2. Heading level values are listed at Heading Level Identifiers.



Figure 1: The Accessibility Insights for Windows tool reporting text as being exposed semantically as a heading of level 2.


With the programmatic representation now verified, the Narrator screen reader can be pointed at the UI, and can list all the headings, and provide an efficient means to navigate through the headings.



Figure 2: Narrator’s “Search Headings” feature listing the headings being presented in a page.


High level consideration #1

So why did this bug exist in the dev’s UI? Did the feature spec that the dev was working with describe the heading levels associated with the text? Perhaps if the spec had included that information, the dev would have implemented the UI accordingly. And if the dev was not familiar with the concept or implementation of a semantic heading, he could have followed up when reviewing the requirements laid down in the spec. Either way, the bug might have been avoided.

Or perhaps the feature spec included details of the size of the font to be used for text, but no explicit details of the text’s heading level. Was it the role of the dev to map the font size to some heading level?

Whatever the cause of the bug, one possible consequence is that any dev dealing with a bug like this decides for themselves, while attempting to fix the bug, what heading levels should be applied. If the dev does this, is the dev familiar with best practices around the use of headings, and will the levels chosen be in line with the use of headings in other, similar UI?

It’s not the role of the dev to be designing UI in response to being assigned a bug. Rather, it’s the role of whoever’s designing the UX to design it for all customers, regardless of how those customers interact with the UI.

So every time a dev fixes a bug like this, the team needs to discuss how the bug came about. If it was due to the feature spec that the dev was given not describing the heading levels of the text in the UI, the UX design processes should be updated to ensure the heading levels are included in future specs.


High level consideration #2

In the case of this particular bug, the UWP XAML UI framework made it straightforward for the dev to implement a fix. But what would have happened if the dev were working with a different type of UI?

Many bugs are logged because an accessibility principle has been violated, and the bug has impacted the customer’s ability to efficiently complete their tasks. And I’d say it’s appropriate for the bug to be logged, regardless of the type of UI being used. Your customer is impacted, and you need to be fully aware of the experiences you’re delivering.

In some cases, for example when building web UI or UWP XAML UI, exposing headings semantically is straightforward, and I’d say there’s no compelling reason not to help your customers through semantic headings. But for some other types of UI, the UI framework may not natively support semantic headings, and you and your team must decide whether it’s practical and reasonable for you to take action which might help your customers despite the current level of support in the UI framework you’re using.

On a related note, I wrote an article a while back around options for exposing headings semantically in Win32 and WPF apps, at An exploration into adding support for heading navigation in WPF and Win32 apps.

So an important consideration here is that your team needs to know what’s practical around fixing a particular accessibility bug, relative to the type of UI you’re using. You don’t want to lose a lot of time researching, reaching out to others, and going back and forth with the person who logged the bug, if you’re ultimately going to be resolving the bug as “External” due to a constraint in the type of UI you’re using.


Bug 2: I’m not sharing important information with all of my customers

A dev contacted me after getting a bug related to his use of an ErrorProvider in his WinForms app. The ErrorProvider control is a fairly handy control for providing error-related information to some of your customers. But the ErrorProvider documentation points out that it must only be used in conjunction with other features that ensure that all of your customers have access to that error-related information.

From the documentation:

“The ErrorProvider component does not provide built-in support for accessibility clients. To make your application accessible when using this component, you must provide an additional, accessible feedback mechanism.”

 

So the question that came my way related to what additional error-related feedback might be appropriate here. Below are the three points that I suggested. 

Point 1

Consider showing the error text visually next to the text box. This means that while the error applies, a sighted customer is efficiently made aware of it, without having to rely on the mouse or touch to interact with a small hit target.

Also, present that visual error text near the text box to which it applies, to increase the likelihood that if magnification is being used, some of the error text is in the magnified view while your customer is working at the text box. In fact, having the error text presented near the text box helps with cognitive association for all sighted customers.

Point 2

While the error applies, set the accessible help property for the TextBox to be the error message. This means your customers using screen readers can access the information when interacting directly with the TextBox, and not have to explore nearby controls in order to access the information.

Point 3

All of your customers must be made aware of important changes that are happening in your UI at the time those changes are occurring. If only points 1 and 2 above were respected, your customer would tab away from the TextBox, the error would appear visually, and the error would be programmatically accessible through the TextBox, but your customer would have already moved away from all that. By default, nothing’s going to make them aware of the error at that time. They may become aware of it later, but it would be so much more efficient for them if they were informed of the error at the time it happened.

.NET 4.8 makes it straightforward for app builders to achieve this, by setting the LiveSetting property on the Label, as shown below.

labelError.LiveSetting = AutomationLiveSetting.Assertive;


By doing this, when the text on the Label changes, a screen reader will be notified of the change, and can, if it chooses, immediately make your customer aware of the error. Your customer can then fix the error at that time, rather than having to deal with it later.

If you need to achieve that same effect in the app when .NET 4.8 is not available, then you’d need to take non-trivial action yourself. For example, use the “LiveLabel” control that I invented a while back. I shared details of that control, and its code, at Let your customers know of important status changes in your WinForms app.

The code for the three points above is shown below.

public partial class Form1 : Form
{
    public Form1()
    {
        InitializeComponent();
    }

    private void Form1_Load(
        object sender, 
        EventArgs e)
    {
        this.textBox1.Validating += new
            CancelEventHandler(this.textBox1_Validating);

        // Point 2: Expose the error text through the TextBox's
        // accessible help property.
        this.textBox1.QueryAccessibilityHelp +=
            TextBoxAccessibilityHelp;

        // Point 3: Have screen readers notified when the error text changes.
        labelError.LiveSetting = AutomationLiveSetting.Assertive;
    }

    protected void textBox1_Validating(
        object sender,
        System.ComponentModel.CancelEventArgs e)
    {
        if (Int32.TryParse(textBox1.Text, out _))
        {
            // The input is valid, so clear any earlier error.
            errorProvider1.SetError(textBox1, "");
            labelError.Visible = false;
        }
        else
        {
            // TODO: Localize this.
            string error = "The input must be in the form of a whole number.";

            errorProvider1.SetError(textBox1, error);

            // Point 1: Show the error text visually near the TextBox.
            labelError.Text = error;
            labelError.Visible = true;
        }
    }

    private void TextBoxAccessibilityHelp(
        object sender, 
        QueryAccessibilityHelpEventArgs e)
    {
        e.HelpString = labelError.Text;
    }
}


I built a tiny test app to demonstrate the above actions. The image below shows the AI4W tool reporting that the UI Automation (UIA) HelpText property for the TextBox is the same as the text shown on the nearby visual label.



Figure 3: The Accessibility Insights for Windows tool reporting the UIA HelpText property being exposed by a WinForms TextBox.


High level consideration #3

A great deal of UI is designed to be usable for customers who interact with their devices in the same way that the person who’s designing the UI interacts with their own device. So if the UX Designer consumes information through visuals shown on the screen, and inputs at the device using touch or a mouse, then it’s natural for the UX Designer to focus on the experiences involving those specific interaction methods.

Increasingly today, UX Designers are recognizing that by focusing on experiences designed only for customers who interact with their devices as the Designers do themselves, many potential customers will receive a poor quality experience, and may even be blocked from performing essential work.

As such, UX Designers can work to create delightful experiences for all customers, regardless of how those customers interact with their devices. For example, a customer might consume information through visuals shown on the screen (including magnified visuals) or through audio output from a screen reader, and might use a mouse, touch, keyboard, speech, or eye gaze to interact with their device.


Bug 3: My customer is confused by the “role” of one of my controls (and I don’t know what that means)

Actually, what I’ll describe next isn’t the specific bug that the dev and I discussed, but rather a detail we had to understand before progress could be made on the bug.

The dev had been assigned a bug, and the person logging the bug had discussed the role of a control, where the role would be conveyed to customers using a screen reader. The dev wasn’t familiar with the concept of the “role”, so what does this all mean?

For people familiar with the accessibility of web UI, the role of UI is an old, and very important concept. The role is used as part of building semantic UI, conveying the type of UI to customers. For example, if a piece of UI exposes a role of “Button”, then both your customers and the screen readers they use will have their expectations set on how that UI will behave, and how they can interact with the UI.

So it seems likely that the person who logged the bug against the dev was familiar with web UI accessibility. The dev, however, was working with desktop Windows UI, where from a programmatic accessibility perspective, it’s all about how the UI is exposed through the UI Automation (UIA) interface.

So where’s the role in UIA?

If you were to search for something called “role” amongst all the UIA documentation, you may come across ILegacyIAccessibleProvider::get_Role. I would be very surprised if anyone reading this article would ever need to know about ILegacyIAccessibleProvider, so by default, don’t spend any time researching what that is.

Rather, the issue for the dev with the bug here is that UIA doesn’t have a property called “role”; instead, it has an equivalent property called ControlType, as listed at Automation Element Property Identifiers.

Once the dev and I understood what the person who logged the bug was concerned about, we could use the AI4W tool to confirm that the ControlType properties of the dev’s UI were in fact exposed in the appropriate way.
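Tools aside, the ControlType can also be inspected programmatically through the managed UIA client API. The sketch below is a minimal example, assuming a .NET Framework console app with references to the UIAutomationClient and UIAutomationTypes assemblies; it lists the ControlType of each top-level window on the desktop.

```csharp
using System;
using System.Windows.Automation;

class ControlTypeDump
{
    static void Main()
    {
        // Walk the immediate children of the desktop root, and report each
        // element's ControlType - the UIA equivalent of the web "role".
        TreeWalker walker = TreeWalker.ControlViewWalker;
        AutomationElement child =
            walker.GetFirstChild(AutomationElement.RootElement);

        while (child != null)
        {
            Console.WriteLine(
                "{0}: {1}",
                child.Current.ControlType.ProgrammaticName,
                child.Current.Name);

            child = walker.GetNextSibling(child);
        }
    }
}
```

Running this against the test app from Figure 3 would report ControlTypes such as “ControlType.Edit” and “ControlType.Button”, matching what AI4W shows in its tree view.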

The image below shows the AI4W tool reporting the ControlTypes of the various controls created for the test app shown above in Figure 3. The ControlTypes are shown in the UIA tree hierarchy, and include “button”, “edit”, “text”, “title bar”, and “pane”.

Interestingly the AI4W tool is also reporting an error relating to my TextBox’s support for the UIA Text pattern. I’ve not looked into this yet, but I’m looking forward to understanding what’s going on there.



Figure 4: The Accessibility Insights for Windows tool reporting the UIA hierarchy for a test app’s UI.


High level consideration #4

The terminology around accessibility can vary greatly depending on the type of UI being used, and that can sometimes lead to significant time being lost when resolving accessibility bugs. In this case, the person who logged the bug used terminology which seemed to be unrelated to the UI against which the bug was logged. Some time back, I wrote Common approaches for enhancing the programmatic accessibility of your Win32, WinForms and WPF apps: Part 1 – Introduction, which begins by listing many UI framework-specific accessibility-related terms.

An important detail here is that Windows desktop devs assigned bugs logged by people familiar with web UI may need to be able to map discussions around the “role” of UI to the UIA ControlType.
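As a rough illustration of that mapping, here’s an indicative sketch of how a few common web roles relate to UIA ControlTypes. (This list is just illustrative, not the authoritative mapping, which lives in the ARIA accessibility API mapping documentation.)

```csharp
using System.Collections.Generic;

// An indicative (not exhaustive) mapping from web "role" terminology
// to the nearest UIA ControlType, for devs triaging bugs logged in
// web terms against desktop UI.
static class RoleToControlType
{
    public static readonly Dictionary<string, string> Map =
        new Dictionary<string, string>
        {
            { "button",   "Button" },
            { "checkbox", "CheckBox" },
            { "combobox", "ComboBox" },
            { "link",     "Hyperlink" },
            { "textbox",  "Edit" },
            // A web heading maps to a Text element whose UIA
            // HeadingLevel property is set, rather than to a
            // dedicated "heading" ControlType.
            { "heading",  "Text" },
        };
}
```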

A couple of years ago I found that web UI devs face a similar challenge when they’re told that their UI doesn’t support expected UIA Control patterns. So I wrote Making the connection between HTML and UI Automation Patterns, to help devs consider what that all means.

I’d strongly recommend that devs are aware of the documentation that can help them resolve the accessibility bugs on their plates, and which is specific to the type of UI that they’re building. When you have an accessibility bug assigned to you, you really don’t want to be going to the web, and searching for (say) “making buttons accessible”.

Some dev resources you might want to consider are:


And for a very quick introduction to UI Automation, check out these videos:


Summary

Whenever an accessibility bug is fixed, it’s worth pausing for a moment to consider how that bug could be avoided in the future.

Perhaps it was a genuine dev bug. Maybe there was a typo entered by the dev which wasn’t caught during code review or by the compiler. Or the dev felt under so much pressure, given the long list of bugs on their plate, that they forgot to add some markup to a control’s definition, resulting in a problematic experience for their customers.

Or was the bug indicative of a more systemic issue? Had the feature spec supplied to the dev described a great experience for some customers, yet left other customers underserved? When the UI that the dev was told to build was later tested and found to be problematic, is it then the dev who’s told to understand the problem, research possible solutions, and implement whatever solution seems most appropriate to them?

If the dev is supported by the team, the team’s processes are updated to learn from the accessibility bugs that the dev has had to deal with, reducing the likelihood of such bugs happening again.

That’s good for the dev, the team, and the customer. It means that less time will be spent in the future addressing avoidable bugs, and more time can be spent building features which can help empower every person to achieve more.

Guy
