Design for the Blind

by Oscar Gonzalez
4 min read

Tips on how to design for the blind.

If you have ever had to describe a User Interface to someone over the phone, you know how frustrating that is for both of you. It is far easier to share a screenshot or do a screen share. However, if you are blind and using Assistive Technologies like Screen Readers (VoiceOver or TalkBack), this is your reality.

I recently prepared for the Web Accessibility Specialist (WAS) technical examination, a certification provided by the International Association of Accessibility Professionals (IAAP), and I wanted to create a series of posts calling out some of the most profound things I learned. This is the first one, covering things to keep in mind when designing for “Audio-Structural Interaction UX.”

In their rawest form, apps are visually appealing, dynamic Word Processors. Something close to Notion.

Try deactivating the CSS on your web app or site, for example…

Figure: Apple.com without CSS is just like reading a written document.

Now, look at a typical Word Doc below. As a sighted user, you tend to navigate the document visually and read it in a particular and predictable order. Blind people navigate the content in very similar ways. However, they use Assistive Technologies that read the content and provide them with contextual cues on where they are in the “Document” and what they can do if their input is necessary.

Figure: Structure of an ordinary Word Processor like Google Docs.

Imagine trying to read a Word document with no headings, no hierarchy, and no structure. In most cases, the document would be impossible to understand. Now imagine trying to book a flight in a document like that.

In Summary

Unfortunately, blind people are sometimes limited to browsing/consuming written content online. Things that sighted users do quickly and easily, like buying that plane ticket or posting content online, are incredibly complex and, in most cases, impossible for blind people. Most web and mobile apps are not designed or constructed with this in mind. Let’s change that by keeping the following things in mind:

  • All content must be presented as text or via a text equivalent, since screen readers cannot read non-text content (images, icons) directly. They can, however, read the alt text you provide, so images and other non-text objects should have alt text unless they are purely decorative (see the first sketch after this list).
  • All functionality must be available using only the keyboard. Even though most blind users can physically use a mouse or trackpad, they can’t see the mouse pointer (the third sketch below shows why native elements make this easier).
  • The content must use markup with good structure and semantics (headings, landmarks, tables, lists, etc.). Just like the “Table of Contents” in a Word Doc, screen reader users often pull up lists of headings, landmarks, and other semantic elements to help them understand what is on the page. They can also navigate by these elements (jump directly to the main content landmark, or to a specific heading). Always use a coherent order for each UI element on the page (Focus Order), and do not forget to name custom Landmarks (see the second sketch below).
  • All custom elements that do not come native with the OS (like expand/collapse buttons, dialogs, etc.) must have the correct name/label, role (expressed in HTML or with ARIA), and value, and must update that value when appropriate. Unlike native HTML elements, custom controls have no built-in semantics, so screen readers can’t tell users what the element is or report its properties unless you supply that information via ARIA roles, states, and properties. This is the one teams get wrong the most: you can technically be compliant with the established guidelines, but if you do a poor job here, your app is not really accessible. Try to use the native elements from the OS, as they provide most of what you need (compare the two approaches in the third sketch below).
  • Users must receive immediate feedback, via their screen reader, after every action they take and whenever the system is doing something in the background (Live Regions). Silence after doing anything is always bad (see the last sketch below).
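
Here is the first sketch, a minimal illustration of the text-equivalent rule. The file names, alt text, and the icon button are all made up for illustration:

    <!-- Informative image: the screen reader announces the alt text -->
    <img src="boarding-pass.png" alt="Boarding pass QR code for flight UA 123">

    <!-- Decorative image: an empty alt tells the screen reader to skip it -->
    <img src="divider-flourish.png" alt="">

    <!-- Icon-only button: aria-label supplies the accessible name, and
         aria-hidden keeps the raw SVG out of the accessibility tree -->
    <button aria-label="Search">
      <svg aria-hidden="true" focusable="false"><!-- magnifier icon --></svg>
    </button>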
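
The second sketch shows the kind of structure and landmarks screen reader users pull up and navigate by; the flight-booking content is invented:

    <header>Site banner</header>
    <nav aria-label="Primary">Main menu</nav>
    <main>
      <h1>Book a flight</h1>
      <section aria-labelledby="passengers-heading">
        <h2 id="passengers-heading">Passengers</h2>
        <!-- form fields go here -->
      </section>
    </main>
    <footer>Legal links</footer>

A screen reader user can jump straight to the <main> landmark or move through the heading list. The aria-label on <nav> is how you name a landmark, which matters as soon as a page has more than one landmark of the same kind.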
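
The third sketch contrasts a native control with a custom one, covering both the keyboard and the name/role/value points. The IDs and content are hypothetical:

    <!-- Preferred: a native <button> is focusable and responds to
         Enter/Space for free; aria-expanded carries the value -->
    <button aria-expanded="false" aria-controls="baggage-details">
      Baggage rules
    </button>
    <div id="baggage-details" hidden>Each passenger may check one bag.</div>

    <!-- The hard way: a <div> has no role, no focus, and no keyboard
         behavior, so you must supply all three yourself -->
    <div role="button" tabindex="0" aria-expanded="false"
         aria-controls="baggage-details">
      Baggage rules
    </div>
    <!-- ...plus script that toggles aria-expanded and the hidden
         attribute, and that reacts to Enter and Space, not just click -->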
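
Finally, a sketch of a Live Region for the feedback point. An aria-live="polite" container makes the screen reader announce injected text without interrupting the user or moving focus; the status message is made up:

    <!-- The container must already exist (empty) before the update -->
    <div id="booking-status" role="status" aria-live="polite"></div>

    <script>
      // When the action or background work finishes, inject the message;
      // screen readers announce it automatically.
      document.getElementById('booking-status').textContent =
        'Your seat has been saved.';
    </script>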

Although all of these things primarily benefit blind individuals using screen readers, keep in mind that drawing parallels with how we experience a Word Doc allows us to construct a more coherent information architecture for a page and/or task, which benefits everyone. If the presented content makes sense in a Word Doc, you are on the right track.

I wrote a post about how to improve your Interaction Design skills that is closely related to this one.

As a side note, keep in mind that you should consider many other things when designing and implementing for Audio-Structural experiences to make them not only compliant with guidelines but also inclusive and accessible. These are just suggestions that can get you close.

Oscar Gonzalez

I am a self-taught generalist. I’m currently a Designer at LinkedIn, but I also code, and sometimes I write. I am also a certified technical accessibility specialist (WAS) and a web3 and open-source contributor since 2017.
