Learn from extraordinary people working to advance accessibility through innovation. Acclaimed neuroscientist David Eagleman is translating sensory data and feeding it through the skin using wearable tech. Haben Girma has refused to let the fact that she is deafblind limit her and believes that inclusion is a choice. Jyotsna Kaki lost her sight as a young adult and is now an Accessibility Testing Program Manager at Google. All are dedicated advocates for accessible technology.
This presentation is not about reducing your support for blind and low-vision users. It’s about building better products by expanding your outreach. This topic could be considered controversial, but that’s not the intention. This is about expansion. Focusing on screen reader accessibility has distinct advantages for product developers. If your application works with a screen reader, it should also be usable with a keyboard, voice recognition, and switch control devices. Screen reader accessibility also falls in line with automated testing tools. However, there are many disabilities, and assistive technologies, that are not necessarily benefited by this focus on the blind/low-vision community. Color contrast, closed captioning, readability, consistency in design, user customization, session timeouts, and animation distraction are just a few examples of concerns that often go unaddressed.
The Original Tweet
Watching a blind advocate tell someone with another disability to center blind issues first and wait for the benefits to trickle down. Wow.
Trickle Down Theory suggests economic growth benefits all members of society. The focus is on tax benefits for corporations and the higher income population, as they have the potential for making larger impacts in economic growth. Providing financial incentives to this population will, in theory, eventually result in higher prosperity for all.
I don’t want to make products FOR people with disabilities. I want to make products WITH people with a disability FOR everyone. – Peter Korn, Director, Accessibility, Amazon Lab126
This is a positive method of trickle down, as it provides support for all users. For example: using a readable sans-serif font supports those with low vision and print disabilities. It also trickles down to making the page readable for the complete user base. Improving the readability of text on the Web is one of the simplest and most effective ways to improve usability and ease access to information, also for people with special needs, such as elderly people, or people with print disabilities, such as people with low vision or dyslexia.
Plutchik’s Wheel of Emotions includes the level of emotion, not just the emotional category. Learn to accurately track user experience. The primary emotions are anger, anticipation, joy, trust, fear, surprise, sadness, and disgust, but each has varying degrees of intensity. If left unchecked, emotions can intensify.
Tone of Voice
I have to enter the same information again
Sighs (keep track by using ~ )
Laughter (nervous and genuine)
Rolling eyes, Scrunching of the nose
Moving in chair, circling with mouse
Learn more about understanding user testing cues from Tragic Design
In 2016, members of the QuickBooks Online team visited Mozzeria in San Francisco to learn more about deaf entrepreneurship. Key learnings were the difficulty of finding an accountant and of managing deliveries. ProAdvisors are accountants who have proven their experience with Intuit’s accounting software. Intuit provides a profile and search capability for each member.
In 2017, a Languages field was added to the profile, and Sign Language is one of the options. This will make it easier for people to find an accountant who speaks their language. While driven by learnings from Mozzeria, it will help all of our customers who speak English as a second language. One example is Deaf Tax.
Xbox Emoji Keyboard
Microsoft’s user research found Deaf gamers wanted more expression with their multi-player game communication. This led to the introduction of emojis.
John McWhorter, an American linguist and professor at Columbia University, says emoji are not a language on their own, but they make our thoughts more complete. “They add on a part of language that often gets lost in writing, the expressive and personal part,” he says. (Emoji Aren’t Silly—They Could Actually Help the Deaf)
Gathering actual data, through data mining and through direct communication with your clients who have disabilities, is the only way to truly provide accessibility. You can sit in an office and brainstorm all day about the issues a client might have, but you will not get the same quality of insight into the real-world problems faced by people with disabilities who are trying to use your products unless you get that information directly from them and from their experiences. You cannot imagine all of the possible issues and barriers without their experience and their help. During the research phase of the project, the deaf business owners kept bringing up the same thing: when they had questions about how to use a product, they did not want to call a phone number; they strongly preferred being able to use live chat on our website. They identified both a barrier and the solution they preferred.
Data Mining Keywords
When using keywords, you need to use a wide variety, because people do not use one uniform term for any one disability. For example, a person could use “low vision”, “visually impaired”, or “blind” to refer to a visual disability. Download and contribute to this set of keywords: Accessibility Data Keywords – GitHub
Turn a deaf ear to…
Blind as a bat…
While doing data mining, you need to be careful about what you find. Keywords will also pull up content that is not connected to disabilities. For example, in my situation, when I used the term “deaf”, I was startled to find a huge number of responses. But on looking at the data more closely, I saw the same phrase being used repeatedly, “turn a deaf ear to…”, and this was not connected to any disability.
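A minimal sketch of this kind of keyword matching with idiom filtering, assuming a hypothetical keyword list and exclusion list (real lists should come from a maintained source such as the Accessibility Data Keywords repository mentioned above):

```python
import re

# Hypothetical keyword variants; people use many terms for one disability.
KEYWORDS = ["low vision", "visually impaired", "blind", "deaf"]

# Idiomatic phrases that contain a keyword but are not about disability.
IDIOM_EXCLUSIONS = ["turn a deaf ear", "blind as a bat", "blind spot"]

def mentions_disability(text, keywords=KEYWORDS, exclusions=IDIOM_EXCLUSIONS):
    """Return True if text matches a keyword outside any excluded idiom."""
    lowered = text.lower()
    # Remove idiomatic phrases first so they cannot trigger a false match.
    for idiom in exclusions:
        lowered = lowered.replace(idiom, "")
    return any(re.search(r"\b" + re.escape(k) + r"\b", lowered)
               for k in keywords)
```

Filtering exclusions before matching is what prevents the “turn a deaf ear to…” surprise described above from flooding the results.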
What does accessibility mean?
Making a product or service accessible to a person with a disability should not mean forcing that person to adapt to your concept of what is accessible. Ideally, you should adjust your product or service to fit the abilities of the person with the disability.
For the deaf, not being able to hear may not be the real barrier. For many deaf and hard of hearing people, literacy is a barrier and language is a barrier. Approximately 44% of the deaf do not graduate from high school. Of those who do, half read at below a 4th grade level. This means that only about 25% of deaf adults read above a 4th grade level, and only about 3% read at an 8th grade level. If you want to serve this group, how do you approach the issue of literacy?
Captioning videos is a solution to the barrier of not being able to hear the audio, but it is not a solution to the literacy or language barrier faced by many deaf people. Think of this as a positive challenge: you have an opportunity to open up your product or service to this group of people by figuring out how to make it accessible to them. I have a funny story for you, and it happens to be true. When I was in high school, in one of my classes, my teacher came up to me and said, excitedly, “I have a video to show to the class today, and it is captioned so you will be able to understand it!” The video started and I sat there confused. Yes, it was captioned… but in Korean.
The point of this story is that the video was accessible, which was a great thing, but it did not fit my needs. Have you thought about instructional videos in ASL? Have you considered the reading level required to access the information you provide? Have you thought about other ways to make things accessible?
Only a few years ago, when I wanted to communicate with hearing people, we would write back and forth on a notepad. This did not work well because most hearing people feel burdened by having to write, and it was time consuming. Now, due to speech recognition apps, such as Microsoft Translator, things have changed. More hearing people are willing to use it to communicate with me. And it is very helpful in work situations. Microsoft Translator’s adoption for the deaf and hard of hearing was influenced by early feedback from a deaf engineer at Microsoft.
There is some room for improvement, but the current generation of speech recognition is vastly superior to that of only a couple of years ago. Apps such as this allow for easier daily integration of the deaf and hard of hearing into conversational situations. This may not seem radical to you, but for most deaf people, workplaces are isolating. They are excluded from both social and work conversations and often given only the barest of recaps and summaries of even important information.
Chronic Pain and Package Design
IDEO designed the packaging for the Quell device to reduce frustration and anxiety. “People with pain have a lot to deal with,” Gozani says. “We want to take away any hassle.” It takes about a minute to set up the device and calibrate the stimulation levels.
Nike FlyEase Shoes
These shoes were inspired by a letter Matthew Walzer wrote to Nike about his desire to wear cool sneakers that he could put on by himself. “My dream is to go to the college of my choice without having to worry about someone coming to tie my shoes every day,” he wrote, according to Nike. “As a teenager who is striving to become totally self-sufficient, I find this extremely frustrating and, at times, embarrassing.”
Enabling Xbox One to be accessible for everyone: one important area for us with this release is enabling Xbox One to be used and played by everyone.
Take for instance our new Copilot feature, which allows two controllers to act as if they were one. This will help make Xbox One more inviting to new gamers needing assistance, more fun by adding cooperative controls for any game, and easier for players who need unique configurations to play — whether that is with hands apart, hand and chin, hand and foot, etc. We are also adding new enhancements to Magnifier and Narrator, as well as giving more options over audio output and custom rumble settings on a controller, which was previously reserved for the Xbox Elite Controller.
The following is the proposal for this presentation. I will publish the final presentation for further details.
Trickle-Down Accessibility Proposal
Trickle Down Economics suggests economic growth benefits all members of society. The focus is on tax benefits for corporations and the higher income population, as they have the potential for making larger impacts in economic growth. Providing financial incentives to this population will, in theory, eventually result in higher prosperity for all.
Matt May’s observation on Twitter in 2016 raised awareness of Trickle Down Accessibility:
“Watching a blind advocate tell someone with another disability to center blind issues first and wait for the benefits to trickle down. Wow. ”
Focusing on screen reader accessibility has distinct advantages for product developers. If your application works with a screen reader, it should also be usable with a keyboard, voice recognition, and switch control devices. Screen reader accessibility also falls in line with automated testing tools.
However, there are many disabilities, and assistive technologies, that are not necessarily benefited by this focus on the blind/low-vision community. Color contrast, closed captioning, readability, consistency in design, user customization, session timeouts, and animation distraction are just a few examples of concerns that often go unaddressed.
Accessibility is an important part of any app. Whether you’re developing a new app or improving an existing one, ensure that components are accessible to everyone.
Why develop for accessibility
1 in 5 people will have a disability in their life. – 2010 census
Designing for accessibility benefits people who are blind or have low vision, as well as sighted users whose eyes are occupied (e.g., while driving)
Android includes 4 types of assistive technology:
TalkBack: Screen reader
BrailleBack: Braille output for refreshable braille devices
Switch Access: switch-based control of the device
Voice Access: control the device by voice commands, e.g., “scroll up”
Android O’s major focus: increase productivity for users
New API additions for accessibility
Support for print disabilities (reading disabilities)
New to TalkBack
accessibilityVolume: adjust the accessibility audio volume independently from media volume, so you can watch YouTube and control its volume separately from TalkBack. This is available when TalkBack is on.
Volume from YouTube is quieted while TalkBack is speaking, then fades back into the foreground. There’s a new accessibility volume slider.
New gestures for TalkBack.
If there’s a fingerprint sensor on the back of the device, it can be used by TalkBack users. The sensor has its own set of customizable gestures, for instance, swipe up on the fingerprint sensor. These can be assigned to actions such as long-press.
Quickly enable/disable TalkBack
Long-press both volume keys to quickly turn TalkBack on or off.
This works on any screen, which makes it easier to test apps and to turn off TalkBack when typing information. Press both keys at the same time for a long press, and eventually TalkBack toggles on/off. The accessibility shortcut can instead be assigned to switch control, zoom, or another service.
The new text-to-speech engine can handle multiple languages. Use LocaleSpan to trigger language switching.
Two new APIs
Continuous Gesture API: enables motor-impaired users who use a head tracker to perform drag and drop, zoom, etc.
A new accessibility button is located in the navigation bar. It allows users to quickly invoke context-dependent accessibility features and sits in the same row as the Back and Home buttons.
People with dyslexia or low vision, or those learning a new language, can now use Select to Speak, part of TalkBack 5.2: select an element on screen and TalkBack will read it aloud. A floating action button enables it.
In Android O: read the whole page, advanced controls, word-level highlighting, and a setup wizard.
Manual testing: try your app with TalkBack and Switch Access.
If it works in TalkBack, it should be good for BrailleBack and Select to Speak.
If it works with Switch Access, it should also work with Voice Access.
Current crowdsourcing approaches are not suitable for web accessibility assessment because the assessment tasks require a high level of expertise and experience.
Tasks were assigned to participants and their results were compared.
An algorithm was developed to compare these values and build a cost model. This lets them use historical data to find that a person is more efficient at one of the rulesets; for instance, a completely blind person may be great at form labels but not at color contrast.
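The paper’s actual algorithm isn’t reproduced here; the following is a hypothetical simplification of such a cost model, where each worker has a historical accuracy per ruleset and a task is routed to the worker with the lowest expected cost (time divided by accuracy):

```python
# Hypothetical historical accuracy per worker per ruleset.
HISTORY = {
    "alice": {"form-labels": 0.95, "color-contrast": 0.30},
    "bob":   {"form-labels": 0.60, "color-contrast": 0.90},
}
TIME_PER_TASK = {"alice": 60, "bob": 75}  # seconds, assumed constants

def expected_cost(worker, ruleset):
    """Expected time to get one correct assessment from this worker."""
    accuracy = HISTORY[worker].get(ruleset, 0.5)  # prior for unseen rulesets
    return TIME_PER_TASK[worker] / accuracy

def best_worker(ruleset):
    """Route the task to whoever is historically cheapest for this ruleset."""
    return min(HISTORY, key=lambda w: expected_cost(w, ruleset))
```

Under this sketch, a worker who excels at form labels gets those tasks even if another worker is faster overall, which matches the paper’s observation about per-ruleset efficiency.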
Assessment of semantic taxonomies for blind indoor navigation based on a shopping center use case
Location-based services (LBS)
Many LBS are available thanks to smartphones.
They provide turn-by-turn navigation support using vocal instructions.
We know little about which environmental elements and features are useful, such as tactile paving or braille buttons.
Looking at these data sets, they created a simplified taxonomy based on their similarities
obstacles (not included in the previous taxonomies)
These elements are defined by their fixed positions within the floor map, and this information is used to generate vocal instructions, such as locating tactile paving:
“proceed 9 meters on braille blocks, and turn right”
“proceed 20 meters, there are obstacles on both sides”
Announcements of obstacles and tactile paving were confusing and unnecessary for one guide dog user.
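Instruction strings like the two examples above could be composed from the floor-map elements; this is an illustrative sketch, not the paper’s implementation, and the parameter names are invented for clarity:

```python
def vocal_instruction(distance_m, surface=None, obstacles=None, turn=None):
    """Compose a turn-by-turn vocal instruction from floor-map attributes."""
    parts = [f"proceed {distance_m} meters"]
    if surface:  # e.g. tactile paving located along the segment
        parts[0] += f" on {surface}"
    if obstacles:
        parts.append(f"there are obstacles on {obstacles}")
    if turn:
        parts.append(f"and turn {turn}")
    return ", ".join(parts)
```

The guide dog user’s feedback suggests such a generator would also need per-user settings to suppress obstacle and paving announcements.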
Do web users with autism experience barriers when searching for information within web pages?
The study looked at eye gazing to see if there was a difference between two groups: with and without autism.
With a series of search tasks, the group with autism had less success than the control group in completing the tasks.
Eye gaze was tracked across five elements: a, b, c, d, e. A participant’s scan path might be a-b-c-e-d.
The variance between the two groups was then compared.
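The study’s exact variance metric isn’t given here; one common way to compare scan paths such as a-b-c-d-e and a-b-c-e-d is string edit distance, sketched below as an assumption about the general approach:

```python
def edit_distance(a, b):
    """Classic Levenshtein distance between two scan-path strings."""
    prev = list(range(len(b) + 1))          # distances for empty prefix of a
    for i, ca in enumerate(a, 1):
        curr = [i]                          # distance from a[:i] to empty b
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,            # deletion
                            curr[j - 1] + 1,        # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]
```

A larger average distance from an expected path (a-b-c-d-e) for one group would indicate a systematically different scanning pattern.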
The #DysMusic study is creating a language independent test for detecting #dyslexia in children. #w4a2017 @luzrello
Most dyslexia detection tools are still linguistics-based, which is not appropriate until the child is already 7-12 years old. This study tries to find a non-language-based detection method, which would allow detection at a much younger age.
There is a memory game with music elements.
Find the matching sounds
Distinguish between sounds
Short time-interval perception
Raw sound is modified via frequency, length, rise time, rhythm. Only one property is modified at a time. People with dyslexia tend to have trouble detecting rise time changes.
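As an illustration of modifying a single property, here is a sketch that synthesizes a sine tone with an adjustable rise time (a linear attack envelope); the sample rate and parameters are arbitrary choices for the example, not the study’s:

```python
import math

SAMPLE_RATE = 8000  # Hz, chosen arbitrarily for this sketch

def tone_with_rise(freq_hz, duration_s, rise_s):
    """Synthesize a sine tone whose amplitude ramps up over rise_s seconds."""
    n = int(SAMPLE_RATE * duration_s)
    rise_n = max(1, int(SAMPLE_RATE * rise_s))
    samples = []
    for i in range(n):
        envelope = min(1.0, i / rise_n)  # linear attack, then full sustain
        samples.append(envelope * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE))
    return samples
```

Pairing two tones that differ only in `rise_s` yields the kind of single-property contrast the game relies on, since rise-time perception is where people with dyslexia tend to struggle.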
Producing Accessible Statistics Diagrams in R
Data visualization is increasingly important, and R is an established language for statistics. Jonathan (co-author) had been using R to output printed diagrams of statistics. They worked together to make R output an accessible SVG format.
Histograms and boxplots, which present discrete data, were the layouts targeted for the initial project; time series and scatter plots present continuous data.
Extract the important data points, convert them to an XML document, and attach this to the SVG. The final experience provides easy navigation (arrow keys) and supports screen readers via ARIA live regions.
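A minimal sketch of the general idea, attaching the underlying data to SVG elements so assistive technology can announce it; the element structure and labels here are illustrative assumptions, not the project’s actual output:

```python
import xml.etree.ElementTree as ET

SVG_NS = "http://www.w3.org/2000/svg"
ET.register_namespace("", SVG_NS)  # emit SVG as the default namespace

def accessible_histogram_svg(bins):
    """Build a minimal SVG whose bars carry their data as <title> text."""
    svg = ET.Element(f"{{{SVG_NS}}}svg",
                     {"role": "img", "aria-label": "Histogram"})
    for i, count in enumerate(bins):
        bar = ET.SubElement(svg, f"{{{SVG_NS}}}rect",
                            {"x": str(i * 20), "y": str(100 - count),
                             "width": "18", "height": str(count)})
        # The <title> child is what screen readers can announce per bar.
        title = ET.SubElement(bar, f"{{{SVG_NS}}}title")
        title.text = f"Bin {i}: {count}"
    return ET.tostring(svg, encoding="unicode")
```

Embedding the data values directly in the graphic is what makes per-element keyboard navigation possible, rather than exposing the chart as a single opaque image.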
GazeTheWeb is a simplified browser designed for eye tracking navigation. #w4a2017 #a11y
Math Melodies makes math easier to learn for children who are blind or have low vision: math exercises as puzzles, audio icon maps, and different exercise types. It was funded via crowdfunding and has been downloaded 1,400 times.
NavCog is a navigation project from CMU for blind individuals. It uses low-energy Bluetooth beacons.
Installation of the beacons is not scalable across large areas. To crowdsource the task, they created a set of instructions that walks through the process of configuring and installing the beacons.
LuzDeploy is a Facebook Messenger bot, which makes the process easy to use.
VizLens is a crowdsourced interpretation of physical interfaces, such as a microwave oven’s controls. Multiple volunteers are recruited to generate labels for the interface; the app then uses augmented reality to virtually overlay the labels.
Chatty Books is an HTML5 + DAISY reader that creates an audio version of documents. It can now convert from PDF to multimedia DAISY.
PDF – NiftyReader (text)
Export to multimedia DAISY or EPUB3
Drag and drop into Chatty Books, the DAISY player and library
Upload DAISY content to the Chatty Books service (cloud) and use the Chatty Books app on iPad
Able to read my mail
A simplified email program for people with learning and intellectual disabilities: a Gmail plugin that converts messages to simplified text or pictograms.
Closed ASL Interpreting for online videos
They created a framework for incorporating an interpreter: Closed Interpreting, analogous to closed captioning.
The interpreter window needs to be flexible, allowing the user to move it around and resize it to reduce distractions. It’s closed, so it can be turned on/off.
Moving the eyes back and forth for long periods can be exhausting, so the window can be moved closer to the screen’s content.
Eye-gaze tracking pauses the video when the user looks away from it.
Closed Interpreting [CI]
Provide a video interface that allows closed interpreting, like closed captioning. The interface provides a second screen that includes an ASL interpreter
The users appreciated the ability to customize the interpreter’s location. They also liked the ability to pause the interpreter as the gaze moved from content to the interpreter