Designing keyboard interactions to improve accessibility: the eCare Monitoring use case

ENGIE Digital
5 min read · Jun 7, 2022

By Céline Bouillon

Last February, I shared an article to advocate for accessibility by design. If you don’t know what accessibility is, or have any doubt about its importance, feel free to read this paper first. Don’t hesitate to share your thoughts and engage in the conversation! In this follow-up, as promised, I would like to share how we’ve been working on including keyboard navigation at eCare.

Confession: we (too) used to overlook the keyboard

When we started working on improving accessibility for eCare Monitoring, a piece of software that enables experts to remotely monitor connected installations, some of our blinders needed to be removed. Among them was how people navigate the Web. We used to design and develop mainly with mouse navigation in mind. By doing so, we were completely forgetting keyboard users and creating massive accessibility issues.

You might wonder what the link is between keyboard and accessibility. Some users, with or without disabilities, rely heavily on keyboard navigation. For instance, you can think of:

  1. people using a screen reader (e.g., with visual impairments or learning disabilities) — which may be coupled with a keyboard to navigate from one focusable element to the next,
  2. individuals who have difficulties using a mouse — it requires fine motor skills,
  3. power users who are used to relying on a keyboard for efficiency…
A desktop keyboard with a refreshable Braille display
People with visual impairments may use a screen reader along with a keyboard, or even a refreshable Braille display, to navigate the web. Photo credits: Unsplash

Unfortunately, keyboard navigation is still often neglected, notably because:

  1. the focus indicator is seen as an unsightly addition to user interfaces (UI) and is consequently hidden in the development phase,
  2. knowledge is lacking on how people interact with components (e.g., links, buttons, carousels…) when they use a keyboard,
  3. designing and implementing these interactions are seen as bringing unnecessary complexity to the table.

This was the case at eCare. Overlooking keyboard usage meant that some people were simply unable to use our software. As I wrote in my first article, I think empathy is key in design. And here was an opportunity to learn about this overlooked type of navigation, work on it, and make our software more inclusive.

Bringing elements of keyboard navigation to our design process

To add keyboard navigation into eCare Monitoring’s experience, I first worked on making the focus indicator visible. To do so, I leveraged an excellent article written by Sara Soueidan: A guide to designing accessible, WCAG-compliant focus indicators. I mixed her takeaways on contrasting areas, based on the Web Content Accessibility Guidelines (WCAG), with Fluid — ENGIE’s Design System to try and find the “right” focus indicator for various components we use in our software.

Six button design attempts at finding the right balance for the focus indicator
Here is an example of my attempts at finding the right balance (contrast, solid outline width, outline offset…) for our focus indicator. The one on the bottom right, with its widest offset and solid border, seemed to be the most visible
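The contrast part of those attempts can be checked programmatically: WCAG defines a contrast ratio between two colors from their relative luminance, and WCAG 2.2's Focus Appearance criterion expects at least 3:1 between the focus indicator and adjacent colors. Here is a minimal sketch of that math (helper names are mine, not Fluid's):

```typescript
// WCAG contrast-ratio math (illustrative helper, not eCare's actual code).
// Channels are sRGB values in 0–255.
type RGB = [number, number, number];

// Relative luminance per the WCAG 2.x definition.
function relativeLuminance([r, g, b]: RGB): number {
  const [rl, gl, bl] = [r, g, b].map((v) => {
    const c = v / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * rl + 0.7152 * gl + 0.0722 * bl;
}

// Contrast ratio (L1 + 0.05) / (L2 + 0.05), with L1 the lighter luminance.
function contrastRatio(a: RGB, b: RGB): number {
  const la = relativeLuminance(a);
  const lb = relativeLuminance(b);
  return (Math.max(la, lb) + 0.05) / (Math.min(la, lb) + 0.05);
}

console.log(contrastRatio([255, 255, 255], [0, 0, 0])); // ≈ 21, the maximum ratio
```

A focus indicator candidate can then be checked against its background: if `contrastRatio(indicator, background)` falls below 3, the outline color (or its width/offset) needs rework.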

Once the focus was visible, I needed to design keyboard interactions for complex components (e.g., a listbox) to guide further front-end development. As designers and developers, we can count on a technical specification, WAI-ARIA (the Web Accessibility Initiative's Accessible Rich Internet Applications), to help us define the keyboard interactions users expect.
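For a listbox, for instance, the WAI-ARIA Authoring Practices describe arrow keys to move between options and Home/End to jump to the first or last one. The navigation logic can be sketched as a small pure function, independent of any framework (a simplified sketch, not eCare's implementation):

```typescript
// Next active option index for a listbox, following the keyboard pattern
// described in the WAI-ARIA Authoring Practices (simplified: no wrapping,
// no type-ahead).
function nextOptionIndex(current: number, key: string, count: number): number {
  switch (key) {
    case "ArrowDown":
      return Math.min(current + 1, count - 1); // stop at the last option
    case "ArrowUp":
      return Math.max(current - 1, 0); // stop at the first option
    case "Home":
      return 0; // jump to the first option
    case "End":
      return count - 1; // jump to the last option
    default:
      return current; // unhandled keys leave the active option unchanged
  }
}
```

A keydown handler would call this, update `aria-activedescendant` (or move DOM focus with a roving `tabindex`), and call `event.preventDefault()` for the handled keys so that arrow presses do not also scroll the page.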

Including these interactions in eCare Monitoring's user experience required back-and-forth discussions between several roles in the team: design, accessibility expertise and development. This was especially true for custom combinations of complex components. The help of an accessibility expert was critical here, so that we could benefit from their hands-on experience, including the impact on development and the "real-life" usage of complex components.

The following process would typically unfold:

  1. The accessibility expert and I would work on keyboard interaction proposals,
  2. We would discuss and challenge our proposals iteratively to land on a common one that would be specified along the way. The key here was to find the right balance between User Experience (UX) and development complexity,
  3. Developers would be included to refine our common proposal,
  4. Support would be provided once the component was in the development phase — some interactions could be adjusted if needed.

The first step was quite a challenge, given that I had little knowledge of keyboard experience at the beginning. Here is how I would work on my design proposals:

  1. Learn more about keyboard interactions and try to navigate on my own, based on components I knew were accessible. For instance, this is the case in official design systems like the US Web Design System, the Gov UK Design System and let’s not forget about the Système de Design de l’État
  2. Deep-dive into relevant complex components and their associated WAI-ARIA authoring practices
  3. Prototype various interactions in Figma (keyboard interactions are indeed supported 🎉) and check for consistency in terms of user experience
Interaction details for a given complex component in Figma with a key combination as trigger
This is an example of interaction details for a given complex component in Figma. The use of Shift+Tab, so that the user may move focus back to the previous focusable element, is prototyped here.

Let’s take a more concrete example to illustrate the whole process. In eCare Monitoring, a given category of filters helps experts select one type of connected installation in a data table. This select filter could be perceived as a button triggering a list of buttons, since pressing one in the list triggers the filtering of the table. Alternatively, the filter looks like a dropdown, so some keyboard users might expect to navigate a list of options. These two proposals emerged in the discussions and both seemed technically feasible. So we decided to integrate, prototype and specify both ways of interacting:

  • in a list of buttons — using Tab to move focus to the next button in the list, for example
  • in a collapsible dropdown — using the down arrow key to move from one option to the next for instance
An example of filter on types of installations in eCare Monitoring. It could be a list of buttons or a list of options in a dropdown menu
User-expected keyboard interactions will depend on how the component may be perceived (visually, by hearing or touch). Here this filter “type of installations” (boiler, photovoltaic panels, heat pump) could either be a list of buttons or a list of options in a dropdown menu
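The two perceptions lead to different keyboard handling: in the "list of buttons" version, Tab and Shift+Tab are left to the browser, whereas in the "dropdown" version, arrow keys move between options and Tab leaves the widget. That branching could be sketched as follows (mode names and return shape are my own assumptions for illustration):

```typescript
type FilterMode = "button-list" | "dropdown";

// What the filter widget should do with a key press, per interaction mode
// (illustrative sketch; a real component would also handle Enter, Escape,
// Home/End, etc.).
function handleFilterKey(
  mode: FilterMode,
  key: string,
  active: number,
  count: number
): { nextActive: number; preventDefault: boolean } {
  if (mode === "button-list") {
    // Each installation type is a real <button>: Tab/Shift+Tab already move
    // focus between them, so we let the browser handle every key.
    return { nextActive: active, preventDefault: false };
  }
  // Dropdown: arrow keys move the active option; Tab falls through so the
  // browser moves focus out of the widget.
  switch (key) {
    case "ArrowDown":
      return { nextActive: Math.min(active + 1, count - 1), preventDefault: true };
    case "ArrowUp":
      return { nextActive: Math.max(active - 1, 0), preventDefault: true };
    default:
      return { nextActive: active, preventDefault: false };
  }
}
```

Writing both variants down this way made it easier to compare their development cost and to keep the specified behavior consistent with what each visual presentation suggests.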

User testing is crucial… and we are not there yet

I believe this adjustment in our production process was important and worthwhile. Constant discussions between different roles in the team and accessibility expertise proved key to ensure experience consistency in our monitoring service. It is worth noting though that this is just one step in our process. There are still several steps in which accessibility could be improved and embedded.

I think user testing is the next phase we should enhance. For example, including keyboard and assistive technology (AT) users in our prototyping phase, and again once the experience is live, would validate our proposals even further. I believe that relying solely on accessibility expertise is not enough and represents a bias to mitigate with additional user feedback. After all, what is design without testing, bias mitigation and iterations?

And you, how do you proceed? Do you prototype and test all types of interactions? Don’t hesitate to share if you have any advice or thoughts on this! 😊


ENGIE Digital

ENGIE Digital is ENGIE’s software company. We create unique software solutions to accelerate the transition to a carbon-neutral future.