Google IO 2017 – Accessibility Notes

Building an accessibility team

The project champion needs to make sure training is available to all people on the team.

  • Create a set of primary and secondary key user paths, e.g., put an item in a shopping cart (primary); change an avatar image (secondary)
  • Add a checklist and try user testing

Resources and Ideas:

What’s new in Android Accessibility

Accessibility | Android Developers

Accessibility is an important part of any app. Whether you’re developing a new app or improving an existing one, ensure that components are accessible to everyone.

Why develop for accessibility

  • 1 in 5 people will have a disability in their lifetime (2010 U.S. Census)
  • Designing for accessibility benefits people who are blind, have low vision, or whose eyes are occupied (e.g., driving)

Android includes 4 types of assistive technology:

  • TalkBack: Screen reader
  • BrailleBack: Braille output for refreshable braille devices
  • Switch Access: switch control of device
  • Voice Access: control device by voice activation: “scroll up”

Android O’s major focus: increase productivity for users

  • new API additions for accessibility
  • print disabilities (reading disabilities)

New to TalkBack

accessibilityVolume: adjust audio volume for accessibility independently from media, so you can watch YouTube and control its volume separately from TalkBack. This is available when TalkBack is on.

Volume from YouTube is quieted while TalkBack is speaking, then fades back into the foreground. There’s a new accessibility volume slider.

New gestures for TalkBack

If there’s a fingerprint sensor on the back of the device, TalkBack users can use it. The sensor has its own set of customizable gestures; for instance, swiping up on the sensor can be assigned to an action such as long press.

Quickly enable/disable TalkBack

Long press both volume keys at the same time to quickly turn TalkBack on or off.

This works on any screen, which makes it easier to test apps and to turn off TalkBack while typing information. Hold both keys down and after a moment TalkBack toggles on or off. The accessibility shortcut can also be assigned to Switch Access, zoom, or another service.

The new text-to-speech engine can handle multiple languages. Use LocaleSpan to trigger language switching.

2 new APIs

Continuous Gesture API: enables motor-impaired users who use a head tracker to perform drag and drop, zoom, etc.

Accessibility Button:

A new accessibility button is located in the navigation bar, in the same row as the Back and Home buttons. It allows users to quickly invoke context-dependent accessibility features.

Print disabilities

People with dyslexia, low vision, or who are learning a new language can now use Select to Speak, part of TalkBack 5.2. Select an element on screen and TalkBack will read it. A floating action button enables the feature.

In Android O: read the whole page, advanced controls, word-level highlighting, and a setup wizard.

Testing

Manual testing: try your app with TalkBack and Switch Access.

  • If it is OK in TalkBack, it should be good for BrailleBack and Select to Speak
  • If it works with Switch Access, it should also work with Voice Access

Accessibility Scanner is a free app on the Play Store: Android's Accessibility Scanner.

It analyzes the current screen and provides an audit that can be shared.

The Accessibility Test Framework still requires Espresso and/or Robolectric.

Android Accessibility Documentation

Android has a new developer hub for understanding accessibility. There’s a page for Android testing.

Web4All Conference Notes – Day 3

Crowd sourcing accessibility evaluations

2013-2016: 350 government websites and 2,000 non-government sites have been evaluated for accessibility in China.

Conformance testing included:

  • Automatic Evaluation
  • Manual Assessment

Crowdsourcing can harness the power of crowds to solve the manual assessment bottleneck.

Crowdsourcing was proposed in 2006 and has been used for reCAPTCHA, Spoken Wikipedia, and labeling tasks.

Current crowdsourcing is not suitable for web accessibility because the assessment tasks require a high level of expertise and experience.

Tasks were assigned to workers, and the results were compared on:

  • total work
  • time out
  • give up
  • errors detected.

An algorithm was developed to compare these values and build a cost model. This lets them use historical data to find that a person is more efficient at one of the rulesets; for instance, a blind person may be great at form labels but not at color contrast.

Assessment of semantic taxonomies for blind indoor navigation based on a shopping center use case

Location-based services (LBS)

  • Many LBS are available thanks to smartphones
  • They provide turn-by-turn navigation support using vocal instructions
  • We know little about which environmental elements and features are useful, such as tactile paving or braille buttons

They did a survey of existing taxonomies.

Looking at these data sets, they created a simplified taxonomy based on their similarities

  • Pathways
  • doorways
  • elevators
  • venues
  • obstacles (not included in the previous taxonomies)

These elements are defined by their fixed positions within the floor map, and that information is used to generate vocal instructions, for example to locate tactile paving:

  • “proceed 9 meters on braille blocks, and turn right”
  • “proceed 20 meters, there are obstacles on both sides”

Announcements of obstacles and tactile paving were confusing and unnecessary for one guide dog user.

Do web users with autism experience barriers when searching for information within web pages?

The study used eye-gaze tracking to see if there was a difference between two groups: with and without autism.

With a series of search tasks, the group with autism had less success than the control group in completing the tasks.

Eye gaze was tracked across five elements (a, b, c, d, e); a participant’s gaze map might be a-b-c-e-d.

The variance between the two groups was then compared.

DysMusic

The #DysMusic study is creating a language independent test for detecting #dyslexia in children. #w4a2017 @luzrello

Most dyslexia detection tools are still linguistics-based, which isn’t appropriate until the child is already 7-12 years old. This study tries to find a detection method that is not language-based, which would allow detection at a much younger age.

There is a memory game with music elements.

Tasks

  • Find the matching sounds
  • distinguish between sounds
  • short time interval perception

Raw sound is modified via frequency, length, rise time, and rhythm. Only one property is modified at a time. People with dyslexia tend to have trouble detecting rise time changes.

Accessibility Challenge

Producing Accessible Statistics Diagrams in R

Data visualization is increasingly important, and R is an established language for statistics. Jonathan (co-author) had been using R to output printed diagrams of statistics. They worked together to convert R’s output into an accessible SVG format.

Histograms and boxplots, as discrete data presentations, were the layouts targeted for the initial project. Time series and scatter plots are continuous data graphs.

The important data points are extracted, converted to an XML document, and attached to the SVG. The final experience provides easy navigation (arrow keys) and supports screen readers via ARIA live regions.

GazeTheWeb

GazeTheWeb is a simplified browser designed for eye tracking navigation. #w4a2017 #a11y

Math Melodies

Math Melodies makes math easier to learn for children who are blind or have low vision: math exercises as puzzles, audio icon maps, and varied exercise types. It was funded via crowdfunding and has been downloaded 1,400 times.

NavCog

NavCog is a navigation project from CMU for blind individuals. It uses Bluetooth low-energy beacons.

Installing the beacons is not scalable across large areas. To crowdsource the task, they created a set of instructions that walks volunteers through configuring and installing the beacons.

LuzDeploy

LuzDeploy is a Facebook Messenger bot, which makes it easy to use.

VizLens

VizLens is a crowdsourced interpretation of physical interfaces, such as a microwave oven. Multiple volunteers are recruited to generate labels for the interface, and the app then uses augmented reality to virtually overlay the labels.

Chatty Books

Chatty Books is an HTML5 + DAISY reader that creates an audio version of documents. It can now convert PDF to multimedia DAISY.

  1. PDF – NiftyReader (text)
  2. Export to multimedia DAISY or EPUB3
  3. Drag and drop into Chatty Books, the DAISY player and library
  4. Upload DAISY content to the Chatty Books service (cloud) and use the Chatty Books app on an iPad

Able to read my mail

A simplified email program for people with learning and intellectual disabilities: a Gmail plugin that converts messages to simplified text or pictograms.

Closed ASL Interpreting for online videos

The project created a framework for incorporating an interpreter: closed interpreting, instead of closed captioning.

The interpreter window needs to be flexible, allowing the user to move it around and change its size to reduce distractions. It’s closed, so it can be turned on and off.

Moving the eyes back and forth for long periods can be exhausting, so the window can be moved closer to the screen’s content.

Eye-gaze tracking pauses the video when the viewer looks away from it.

Closed Interpreting [CI]

Provide a video interface that allows closed interpreting, like closed captioning. The interface provides a second screen that includes an ASL interpreter.

The users appreciated the ability to customize the interpreter’s location. They also liked the ability to pause the interpreter as their gaze moved from content to interpreter.

Web4All 2017 notes for April 3, 2017

Microsoft’s Inclusive Hiring

Microsoft’s David Masters started the day with a keynote discussing Microsoft’s Journey Towards Inclusion.

Microsoft has had a cultural shift in the last 18 months.
Their new mission statement is:

Empower every person and every organization on the planet to achieve more


Notes from Web4All 2017 Day 1

Web4All 2017 kicks off with several talks about the gig economy, remote employment, and the current employment opportunities for people with disabilities in Australia and around the world.

Web4All 2017 Day 1 Notes

The Australian Human Rights Commission has begun a study on employment discrimination against older workers and those with a disability.

While about a quarter of the population is older, they make up just 16 per cent of the workforce. Australians with a disability make up 15 per cent of the working age population, but only 10 per cent of them have jobs.

The inquiry will seek to identify the barriers that prevent people from working, and in consultation with employers, affected individuals and other stakeholders establish strategies to overcome these barriers.

Willing to Work

Australia has historically had a higher unemployment rate for people with disabilities than other countries.

The Web Accessibility National Transition Strategy was influenced by work done in European countries. Unfortunately, the tools were not accessible when it launched, so people with disabilities had trouble accessing the participation forms.

Web Accessibility National Transition Strategy PDF version

Mystery Meat 2.0 – Making hidden mobile interactions accessible

Mystery Meat 2.0

  • Ted Drake, Intuit Accessibility
  • Poonam Tathavadkar, TurboTax
  • CSUN 2017
  • Slides: slideshare.net/7mary4

This presentation was created for the CSUN 2017 conference. It introduces several hidden interactions available within Android and iOS. Learn how these work and how to make them accessible.
The Blue Bell Ice Cream web site, with its mystery meat navigation, is a classic example still live on the web.

The user must hover over the different images to see what they represent. It uses an image map and lacks alt text.

Android Touch and Hold

A.K.A. Android’s right click: a long press adds context-specific menus.

  • Default: Touch and hold
  • With TalkBack: Double tap and hold to long press

Mint Transactions

This short video shows how you can use touch and hold (long press) to quickly change a category or merchant name within the Mint application.

Developers

  • onLongClick: Called when a view has been clicked and held
  • Define and Show your menu
  • Not for vital interactions
  • This is a shortcut

It is possible to modify the default announcement given to the user.

iOS 3D Touch

iOS 3D Touch was introduced on the iPhone 6S. It detects the pressure a person applies to the screen with their finger. A light touch is seen as a tap. A medium touch will trigger a peek view. A continued firm touch will launch the peek’s content into a full screen.

This also allows a person to trigger a shortcut menu on app icons.

  • Peek: Quick glance at relevant information and actions
  • Pop: Open full content previewed in the Peek
  • Quick Actions: Custom task list from app icon

User Experience: A light press opens a hovering window so you can “Peek” at the content. When you press just a little bit harder, you will “Pop” into the actual content you’d just been previewing in a Peek.
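
As a rough Swift sketch (not the presenters’ code; the view controller names are hypothetical), Peek and Pop were implemented at the time with UIViewControllerPreviewingDelegate:

    import UIKit

    class TransactionsViewController: UIViewController, UIViewControllerPreviewingDelegate {
        override func viewDidLoad() {
            super.viewDidLoad()
            // Only register for previewing when the hardware supports 3D Touch.
            if traitCollection.forceTouchCapability == .available {
                registerForPreviewing(with: self, sourceView: view)
            }
        }

        // Peek: return the view controller to show in the hovering preview window.
        func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                               viewControllerForLocation location: CGPoint) -> UIViewController? {
            // Stand-in for a real detail screen.
            let detail = UIViewController()
            previewingContext.sourceRect = CGRect(x: location.x - 22, y: location.y - 22,
                                                  width: 44, height: 44)
            return detail
        }

        // Pop: the user pressed harder, so commit the preview to full screen.
        func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                               commit viewControllerToCommit: UIViewController) {
            show(viewControllerToCommit, sender: self)
        }
    }

Apple later deprecated this API in favor of context menus, but it matches the Peek/Pop behavior described above.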

Developer info

Quick Actions

This short video shows how 3D Touch is also available via the app’s icon for quick tasks.

Pressing and holding the ItsDeductible icon will trigger a menu with customized tasks; a sketch follows below.

App Icon Developer resources
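
A minimal Swift sketch of a dynamic Quick Action (the shortcut type string and title are made up for illustration, not taken from ItsDeductible):

    import UIKit

    class AppDelegate: UIResponder, UIApplicationDelegate {
        func applicationDidBecomeActive(_ application: UIApplication) {
            // Register a dynamic Quick Action on the app icon.
            let addItem = UIApplicationShortcutItem(
                type: "com.example.addDonation",   // hypothetical identifier
                localizedTitle: "Add Donation",
                localizedSubtitle: nil,
                icon: UIApplicationShortcutIcon(type: .add),
                userInfo: nil)
            application.shortcutItems = [addItem]
        }

        // Invoked when the user picks the shortcut from the icon's 3D Touch menu.
        func application(_ application: UIApplication,
                         performActionFor shortcutItem: UIApplicationShortcutItem,
                         completionHandler: @escaping (Bool) -> Void) {
            // Route to the matching task, then report whether it was handled.
            completionHandler(shortcutItem.type == "com.example.addDonation")
        }
    }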

Developers

Swipe Revealed Actions

Alternative actions allow users to quickly make changes without having to open a detail screen. For instance, they can delete a transaction or change an email’s status. The standard interface displays the options when a user swipes a row to the left. For VoiceOver users, the options are announced as alternate actions.

ItsDeductible Actions

This short video shows how the alternative actions menu is used in standard mode and how VoiceOver announces the options.

In iOS, editActionsForRowAtIndexPath defines the actions to display in response to swiping the specified row; a sketch follows after the list below.

  • Accessible by default
  • Define:
    • Target Action and Response
    • Visible Text
    • Background color
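
A minimal Swift sketch of that delegate method (the actions and handlers here are illustrative, not the apps’ actual code). Because each row action carries its visible text and handler, VoiceOver can announce it as an alternate action with no extra work:

    import UIKit

    class TransactionListViewController: UITableViewController {
        // Swipe-revealed actions: accessible by default, because each action
        // defines its target action/response, visible text, and background color.
        override func tableView(_ tableView: UITableView,
                                editActionsForRowAt indexPath: IndexPath) -> [UITableViewRowAction]? {
            let delete = UITableViewRowAction(style: .destructive, title: "Delete") { _, indexPath in
                // App-specific: remove the transaction at indexPath.
            }
            delete.backgroundColor = .red

            let recategorize = UITableViewRowAction(style: .normal, title: "Category") { _, indexPath in
                // App-specific: open the category picker for this row.
            }
            recategorize.backgroundColor = .darkGray

            return [delete, recategorize]
        }
    }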

Swipe Based Navigation

TurboTax uses custom swipe-based navigation between views. It lacks buttons or cues for moving back and forth. User testing has shown it to be effective for sighted users, but it required some extra work for accessibility.

Default Experience With VoiceOver

The default experience on Turbo Tax uses a custom swipe gesture that lacks navigation buttons.

TurboTax detects a user’s screen reader/Switch Control status to show navigation buttons on Android and iOS.

This video shows the default and VoiceOver/SwitchControl experience.

Notice in the standard experience how the screen tracks the user’s finger movement. This is not a standard swipe gesture, so it will not work with VoiceOver enabled.

We detect whether VoiceOver or Switch Control is running and display alternate Back and Continue buttons.

Swipe Navigation

  • Animated transition between views
  • Next and Back flow with every screen
  • Eliminates navigation buttons
  • No buttons? Accessibility?
  • Have I reached the end of the screen?

Instead of a basic swipe gesture, this interface tracks the movement of the finger across the screen. This allows the animation to match the user’s finger speed for a more natural transition.

However, the finger touch is intercepted by VoiceOver, so the custom navigation does not work when VoiceOver is enabled.

Detect Accessibility ON

The UIAccessibility API in UIKit provides two functions

True == Show Navigation Buttons

These booleans return true or false; we use them to decide whether to insert navigation buttons into the screen.
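
For example, a check along these lines (written with the current Swift names; the 2017-era code would have called the UIAccessibilityIsVoiceOverRunning() and UIAccessibilityIsSwitchControlRunning() functions):

    import UIKit

    // True == show navigation buttons.
    func shouldShowNavigationButtons() -> Bool {
        return UIAccessibility.isVoiceOverRunning || UIAccessibility.isSwitchControlRunning
    }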

State Change Notification

This does not handle the more complex case where the user turns VoiceOver on or off in the middle of the application flow and the UI needs to change dynamically.

iOS has great support for that as well: it fires a status-change notification that lets us detect the change even when the user is in the middle of the flow and chooses to turn VoiceOver on or off.

User enables VoiceOver while on a screen

Detect the status change

TurboTax Helper Function

  • How can we refactor code to detect any accessibility-related settings and address them together?
  • Helper function to the rescue! (See the sketch after this list.)
  • NSNotificationCenter adds observers to track any settings that may require us to show buttons.
  • This is OR logic. Example: if VoiceOver OR Switch Control status changed, display buttons.
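
A Swift sketch of such a helper (the class and property names are hypothetical, and it uses today's notification names where the talk's Objective-C code used the NSNotificationCenter string constants of the time):

    import UIKit

    final class AccessibilityButtonHelper: NSObject {
        // Called with true when navigation buttons should be shown.
        var onChange: ((Bool) -> Void)?

        override init() {
            super.init()
            let center = NotificationCenter.default
            // Observe every setting that may require us to show buttons.
            for name in [UIAccessibility.voiceOverStatusDidChangeNotification,
                         UIAccessibility.switchControlStatusDidChangeNotification] {
                center.addObserver(self, selector: #selector(statusChanged),
                                   name: name, object: nil)
            }
        }

        @objc private func statusChanged() {
            // OR logic: if VoiceOver OR Switch Control is running, display buttons.
            onChange?(UIAccessibility.isVoiceOverRunning || UIAccessibility.isSwitchControlRunning)
        }
    }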

Code specifics

  • A boolean is assigned a value: true if buttons need to be shown.
  • In a React Native project, all of this happens on the native side (Objective-C). The boolean is then handed over to the JavaScript side, since it is not feasible for JavaScript to get this information directly from the device. A sketch follows below.
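
As a sketch of that hand-off (shown in Swift rather than the talk's Objective-C; the module and event names are hypothetical), React Native's RCTEventEmitter can push the boolean to JavaScript:

    import Foundation

    // Requires the React Native iOS libraries and a bridging header for RCTEventEmitter.
    @objc(AccessibilityStatus)
    class AccessibilityStatus: RCTEventEmitter {
        override func supportedEvents() -> [String]! {
            return ["showNavigationButtons"]
        }

        // Hand the boolean to the JavaScript side whenever the native observers fire.
        func notify(showButtons: Bool) {
            sendEvent(withName: "showNavigationButtons", body: ["show": showButtons])
        }
    }

On the JavaScript side, a NativeEventEmitter subscription to "showNavigationButtons" would then toggle the Back and Continue buttons.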