Web4All Conference Notes – Day 3

Crowd sourcing accessibility evaluations

2013–2016: 350 government web sites and 2,000 non-government sites were evaluated for accessibility in China.

Conformance testing included:

  • Automatic Evaluation
  • Manual Assessment

Crowdsourcing can harness the power of crowds to solve the manual-assessment bottleneck.

Crowdsourcing was proposed in 2006 and has been used for reCAPTCHA, Spoken Wikipedia, and image labeling.

Current crowdsourcing platforms are not suitable for web accessibility, because the assessment tasks require a high level of expertise and experience.

Tasks were assigned to workers, and the results were compared on:

  • total work
  • time out
  • give up
  • errors detected.

An algorithm compares these values to build a cost model. This lets them use historical data to identify which ruleset a person evaluates most efficiently. For instance, a completely blind person may be great at form labels but not at color contrast.
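A minimal Python sketch of that idea, assuming a simple weighted cost over the four measures above. All names, weights, and numbers here are illustrative, not from the paper:

```python
# Hypothetical sketch: route each manual-assessment task to the evaluator
# with the best historical performance for that ruleset.

def best_evaluator(ruleset, history):
    """Pick the evaluator with the lowest historical cost for a ruleset.

    history maps evaluator -> ruleset -> outcome counts. Cost penalizes
    time spent, timeouts, and give-ups, and rewards errors detected.
    (Weights are made up for illustration.)
    """
    def cost(stats):
        return (stats["time"] + 5 * stats["timeouts"] + 5 * stats["give_ups"]
                - 10 * stats["errors_detected"])
    return min(history, key=lambda e: cost(history[e][ruleset]))

history = {
    "alice": {"form_labels":    {"time": 30, "timeouts": 0, "give_ups": 0, "errors_detected": 8},
              "color_contrast": {"time": 90, "timeouts": 3, "give_ups": 2, "errors_detected": 1}},
    "bob":   {"form_labels":    {"time": 60, "timeouts": 1, "give_ups": 1, "errors_detected": 3},
              "color_contrast": {"time": 25, "timeouts": 0, "give_ups": 0, "errors_detected": 7}},
}

print(best_evaluator("form_labels", history))     # alice
print(best_evaluator("color_contrast", history))  # bob
```

In this toy data, "alice" (like the blind evaluator in the example) wins form-label tasks, while "bob" wins color-contrast tasks.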

Assessment of semantic taxonomies for blind indoor navigation based on a shopping center use case

Location-based services (LBS)

  • Many LBS are available thanks to smart phones
  • They provide turn-by-turn navigation support using vocal instructions
  • We know little about which environmental elements and features are useful, such as tactile paving or braille buttons

They did a survey of taxonomies.

Looking at these data sets, they created a simplified taxonomy based on their similarities

  • Pathways
  • doorways
  • elevators
  • venues
  • obstacles (not included in the previous taxonomies)

These elements are defined by their fixed positions within the floor map, and this information is used to generate vocal instructions, for example to locate tactile paving:

  • “proceed 9 meters on braille blocks, and turn right”
  • “proceed 20 meters, there are obstacles on both sides”
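The instruction generation above can be sketched roughly as a template over the taxonomy elements. This is a hypothetical illustration, not the project's actual code; the parameter names are my own:

```python
# Sketch: build a vocal instruction from a path segment's properties.

def instruction(distance_m, turn=None, surface=None, obstacles=False):
    parts = [f"proceed {distance_m} meters"]
    if surface == "tactile_paving":
        parts[0] += " on braille blocks"
    if obstacles:
        parts.append("there are obstacles on both sides")
    if turn:
        parts.append(f"turn {turn}")
    return ", ".join(parts)

print(instruction(9, turn="right", surface="tactile_paving"))
# proceed 9 meters on braille blocks, turn right
print(instruction(20, obstacles=True))
# proceed 20 meters, there are obstacles on both sides
```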

Announcements of obstacles and tactile paving were confusing and unnecessary for one guide dog user.

Do web users with autism experience barriers when searching for information within web pages?

The study tracked eye gaze to see whether there was a difference between two groups: participants with and without autism.

Across a series of search tasks, the group with autism was less successful at completing the tasks than the control group.

The researchers tracked eye gaze across five page elements: a, b, c, d, e. A participant's gaze path might be a-b-c-e-d.

They then checked the variance between the two groups.
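One way to picture that comparison: score each gaze path by how far it deviates from the page's element order (counting inversions), then compare the score distributions between groups. This metric is my illustration of the approach, not necessarily the study's:

```python
# Sketch: an "inversion count" measures how out-of-order a gaze path is
# relative to the expected reading order of the five elements.
from statistics import variance

def inversions(path, expected="abcde"):
    rank = {el: i for i, el in enumerate(expected)}
    r = [rank[el] for el in path]
    return sum(1 for i in range(len(r))
                 for j in range(i + 1, len(r)) if r[i] > r[j])

print(inversions("abcde"))  # 0 (follows element order exactly)
print(inversions("abced"))  # 1 (e and d swapped)

# Hypothetical gaze paths for two groups; compare score variance:
group_a = [inversions(p) for p in ["abcde", "abced", "abcde"]]
group_b = [inversions(p) for p in ["acbed", "baced", "edcba"]]
print(variance(group_a), variance(group_b))
```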

DysMusic

The #DysMusic study is creating a language independent test for detecting #dyslexia in children. #w4a2017 @luzrello

Most dyslexia detection tools are still linguistics based, which isn't appropriate until the child is already 7–12 years old. This study tries to find a non-language-based detection method, which would allow detection at a much younger age.

There is a memory game with music elements.

Tasks

  • Find the matching sounds
  • distinguish between sounds
  • short time interval perception

Raw sound is modified via frequency, length, rise time, or rhythm; only one property is modified at a time. People with dyslexia tend to have trouble detecting rise-time changes.
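The stimulus design can be sketched like this. The structure and property names are assumptions for illustration, not the study's code:

```python
# Sketch: derive each test sound from a base sound by varying exactly
# one property at a time (frequency, length, rise time, or rhythm).

BASE = {"frequency_hz": 440, "length_ms": 500, "rise_time_ms": 15, "rhythm_bpm": 90}

def variant(prop, value):
    if prop not in BASE:
        raise ValueError(f"unknown property: {prop}")
    sound = dict(BASE)   # copy, so exactly one property differs from BASE
    sound[prop] = value
    return sound

# A rise-time comparison pair -- the property dyslexic listeners
# reportedly have the most trouble discriminating:
pair = (BASE, variant("rise_time_ms", 60))
changed = [k for k in BASE if pair[0][k] != pair[1][k]]
print(changed)  # ['rise_time_ms']
```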

Accessibility Challenge

Producing Accessible Statistics Diagrams in R

Data visualization is increasingly important, and R is an established language for statistics. Jonathan (co-author) had been using R to output printed statistical diagrams. They worked together to convert R output into an accessible SVG format.

Histograms and boxplots, as discrete data presentations, were the layouts targeted for the initial project. Time series and scatter plots are continuous data graphs.

They extract the important data points, convert them to an XML document, and attach this to the SVG. The final experience provides easy navigation (arrow keys) and supports screen readers via ARIA live regions.
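A minimal sketch of the attach-data-to-SVG idea, in Python rather than R. The element and attribute names here are assumptions, not the project's actual schema:

```python
# Sketch: embed extracted data points as XML metadata inside an SVG so
# assistive technology can navigate them; an aria-live region lets a
# screen reader announce the focused data point.
import xml.etree.ElementTree as ET

svg = ET.Element("svg", xmlns="http://www.w3.org/2000/svg")
ET.SubElement(svg, "title").text = "Histogram of response times"

meta = ET.SubElement(svg, "metadata")
data = ET.SubElement(meta, "data", {"type": "histogram"})
for lo, hi, count in [(0, 10, 4), (10, 20, 9), (20, 30, 2)]:
    ET.SubElement(data, "bin", {"from": str(lo), "to": str(hi), "count": str(count)})

# Announced text updates as the user arrows through bins:
live = ET.SubElement(svg, "text", {"aria-live": "polite", "id": "announce"})
live.text = "Bin 10 to 20: 9 observations"

print(ET.tostring(svg, encoding="unicode")[:60], "...")
```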

GazeTheWeb

GazeTheWeb is a simplified browser designed for eye tracking navigation. #w4a2017 #a11y

Math Melodies

Math Melodies makes math easier to learn for children who are blind or have low vision. It presents math exercises as puzzles, with audio icon maps and a variety of exercise types. It was funded via crowdfunding and has been downloaded 1,400 times.

NavCog

NavCog is a navigation project from CMU for blind individuals. It uses Bluetooth Low Energy beacons.

Installation of the beacons is not scalable across large areas. To crowdsource the task, they created a set of instructions that walks volunteers through configuring and installing the beacons.

LuzDeploy

LuzDeploy is a Facebook Messenger bot designed to be easy to use.

VizLens

VizLens provides crowdsourced interpretation of physical interfaces, such as a microwave oven's controls. Multiple volunteers are recruited to generate labels for the interface; the app then uses augmented reality to overlay the labels virtually.

Chatty Books

Chatty Books is an HTML5 + DAISY reader that creates an audio version of documents. It can now convert from PDF to multimedia DAISY.

  1. PDF – NiftyReader (text)
  2. Export to multimedia DAISY or EPUB3
  3. Drag and drop into Chatty Books, the DAISY player and library
  4. Upload DAISY content to the Chatty Books service (cloud) and use the Chatty Books app on iPad

Able to read my mail

A simplified email program for people with learning and intellectual disabilities: a Gmail plugin that converts messages to simplified text or pictograms.

Closed ASL Interpreting for online videos

The researchers created a framework for incorporating a sign language interpreter: closed interpreting, instead of closed captioning.

The interpreter window needs to be flexible, allowing the user to move it around and change its size to reduce distractions. It's closed, so it can be turned on and off.

Moving the eyes back and forth for long periods of time can be exhausting, so the window can be moved closer to the screen's content.

Eye-gaze tracking pauses the video when the user looks away from it.

Closed Interpreting [CI]

The project provides a video interface that allows closed interpreting, analogous to closed captioning. The interface provides a second screen that includes an ASL interpreter.

The users appreciated the ability to customize the interpreter's location. They also liked the ability to pause the interpreter as their gaze moved from the content to the interpreter.

Web4All 2017 notes for April 3, 2017

Microsoft’s Inclusive Hiring

Microsoft’s David Masters started the day with a keynote discussing Microsoft’s Journey Towards Inclusion

Microsoft has had a cultural shift in the last 18 months.
Their new mission statement is:

Empower every person and every organization on the planet to achieve more


Notes from Web4All 2017 Day 1

Web4All 2017 kicks off with several talks about the Gig Economy, Remote employment, and the current employment opportunities for people with disabilities in Australia and around the world.

 

Web4All 2017 Day 1 Notes

The Australian Human Rights Commission has begun a study on employment discrimination against older workers and those with a disability.

While about a quarter of the population is older, older people make up just 16 per cent of the workforce. Australians with a disability make up 15 per cent of the working-age population, but only 10 per cent of them have jobs.

The inquiry will seek to identify the barriers that prevent people from working, and in consultation with employers, affected individuals and other stakeholders establish strategies to overcome these barriers.

Willing to Work

Australia has historically had a higher unemployment rate for people with disabilities than other countries.

Web Accessibility National Transition Strategy was influenced by the work done in European countries. Unfortunately, the tools were not accessible when it launched. So people with disabilities had trouble accessing the participation forms.

Web Accessibility National Transition Strategy PDF version

Mystery Meat 2.0 – Making hidden mobile interactions accessible


  • Ted Drake, Intuit Accessibility
  • Poonam Tathavadkar, TurboTax
  • CSUN 2017
  • Slides: slideshare.net/7mary4

This presentation was created for the CSUN 2017 conference. It introduces several hidden interactions available within Android and iOS. Learn how these work and how to make them accessible.
The Blue Bell Ice Cream web site, with its mystery meat navigation, is a classic example still live on the web.

The user must hover over the different images to see what they represent. It uses an image map and lacks alt text.

Android Touch and Hold

A.K.A. Android’s right click: a long press adds context-specific menus.

  • Default: Touch and hold
  • With TalkBack:
    Double tap and hold to long press

Mint Transactions

This short video shows how you can use touch and hold (long press) to quickly change the category or merchant name within the Mint application.

Developers

  • onLongClick: called when a view has been clicked and held
  • Define and show your menu
  • Not for vital interactions; this is a shortcut.

It is possible to modify the default notification to the user.

iOS 3D Touch

iOS 3D Touch was introduced on the iPhone 6S. It detects the pressure a person applies to the screen with their finger. A light touch is seen as a tap. A medium touch will trigger a peek view. A continued firm touch will launch the peek's content into a full screen.

This also allows a person to trigger a shortcut menu on app icons.

  • Peek:
    Quick glance at relevant information and
    actions
  • Pop:
    Open full content previewed in the Peek
  • Quick Actions:
    Custom task list from app icon

User Experience: A light press opens a hovering window so you can “Peek” at the content. When you press just a little bit harder, you will “Pop” into the actual content you’d just been
previewing in a Peek.

Developer info

Quick Actions

This short video shows how 3d touch is also available via the app’s icon for quick tasks.

Pressing and holding the ItsDeductible icon will trigger a menu with customized tasks. (See the App Icon developer resources.)


Swipe Revealed Actions

Alternative actions allow users to quickly make changes without having to open a detail screen. For instance, they can delete a transaction or change an email's status. The standard interface displays the options when a user swipes a row to the left. For VoiceOver users, the options are announced as alternate actions.

ItsDeductible Actions

This short video shows how the alternative actions menu is used in standard mode and how VoiceOver announces the options.

In iOS, editActionsForRowAtIndexPath defines the actions to display in response to swiping the specified row.

  • Accessible by default
  • Define:
    • Target Action and Response
    • Visible Text
    • Background color

Swipe Based Navigation

TurboTax uses custom swipe-based navigation between views. It lacks buttons or suggestions to move back and forth. User testing showed it to be effective for sighted users, but it required some extra work for accessibility.

Default Experience With VoiceOver

The default experience in TurboTax uses a custom swipe gesture that lacks navigation buttons.

TurboTax detects a user’s Screen Reader/Switch Control status to show navigation buttons on Android and iOS

This video shows the default and VoiceOver/SwitchControl experience.

Notice in the standard experience how the screen tracks the user’s finger movement. This is not a standard swipe gesture, so it will not work with VoiceOver enabled.

We detect whether VoiceOver or Switch Control is running and display alternate back and continue buttons.

Swipe Navigation

  • Animated transition between views
  • Next and Back flow with every screen
  • Eliminates navigation buttons
  • No buttons? Accessibility?
  • Have I reached the end of the screen?

Instead of a basic swipe gesture, this interface tracks the movement of the finger across the screen. This allows the animation to match the user’s finger speed for a more
natural transition.

However, the finger touch is intercepted by VoiceOver, so the custom navigation does not work when VoiceOver is enabled.

Detect Accessibility ON

The UIAccessibility framework provides two methods.

True == Show Navigation Buttons

These methods return booleans. When either returns true, we insert navigation buttons into the screen.

State Change Notification

This does not handle the more complex case where the user turns VoiceOver on or off in the middle of the application flow and the change must take effect dynamically.

iOS has great support for that as well: it fires an accessibility-status-changed notification that helps detect changes even when the user is in the middle of the flow and chooses to turn VoiceOver on or off.

User enables VoiceOver while on a screen

Detect the status change

TurboTax Helper Function

  • How can we refactor code to detect any
    accessibility related settings and address them
    together?
  • Helper function to the rescue!
  • NSNotificationCenter adds observers to track any
    settings that may require us to show buttons.
  • This is OR logic. Example: if the VoiceOver OR the Switch Control status changed, display the buttons.

Code specifics

  • A boolean is assigned a value – true if buttons need to be shown.
  • In a React Native project, all of this happens on the native side (Objective-C). The boolean is then handed over to the JavaScript side, since it is not feasible for JavaScript to get this information directly from the device.
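The OR logic in the helper function can be sketched as follows. The actual TurboTax helper lives in Objective-C; this is a Python sketch, and the function name is invented:

```python
# Sketch of the helper's decision: show fallback navigation buttons if
# ANY tracked accessibility setting is active.

def should_show_buttons(voiceover_on, switch_control_on):
    return voiceover_on or switch_control_on

# Re-evaluate whenever a status-change notification fires:
print(should_show_buttons(True, False))   # True
print(should_show_buttons(False, False))  # False
```

In the real app, this boolean would be recomputed on each status-change notification and handed to the UI layer to add or remove the back/continue buttons.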

Accessibility Data Metrics Survey Results

In preparation for an upcoming talk at CSUN 2017 on data metrics for accessibility, I created a survey for fellow accessibility managers to share what they are doing to quantify the accessibility of their products and services. The following is the raw survey results; I will continue working with the data and responses as we prepare for the presentation: Accessibility Data Metrics and Reporting – Industry Best Practices.

Some Initial Observations:

  • Some of the questions were confusing. This is especially true when asking about percentages, as we may know the number of events, but could never compare them to an unknown larger set.
  • WCAG conformance is the starting point for many teams. This includes closed captioning support and issues by category.
  • Most people track the priority of issues, but how is this determined? Is it the priority set in bug tracking software or determined by automated tools?
  • Compliance is a key metric for product managers (VPAT, AODA, and CVAA)
  • Employee and customer participation with open source, conferences, training, and beta testing is a key opportunity for metrics.
  • Few companies are using Net Promoter Score for customer satisfaction regarding accessibility.

You can begin understanding your company’s progress by harnessing the data within your bug tracking software. This article has more information: Accessibility Metrics and JIRA.
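As a starting point, two of the most-tracked metrics below (issues by priority, bugs closed per month) can be derived from an export of your bug tracker. The record structure and field names here are assumptions for illustration:

```python
# Sketch: compute "issues by priority" and "closed per month" from
# hypothetical bug-tracker records.
from collections import Counter

bugs = [
    {"priority": "P1", "closed": "2017-02"},
    {"priority": "P2", "closed": None},       # still open
    {"priority": "P1", "closed": "2017-03"},
    {"priority": "P3", "closed": "2017-03"},
]

by_priority = Counter(b["priority"] for b in bugs)
closed_per_month = Counter(b["closed"] for b in bugs if b["closed"])

print(by_priority)        # Counter({'P1': 2, 'P2': 1, 'P3': 1})
print(closed_per_month)   # Counter({'2017-03': 2, '2017-02': 1})
```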

Question 1:  Which of the following product quality metrics are you currently tracking? (select all that apply)

Ranked by popularity

  1. Issues by Priority – 77%
  2. Issues by WCAG category – 54%
  3. last time product had a manual evaluation – 46%
  4. Closed Captioning support – 38%
  5. Number of accessibility bugs opened per month/year – 31%
  6. Test coverage for a product – 23%
  7. Number of accessibility bugs marked as blocked – 23%
  8. Automated tests pass/fail – 23%
  9. Number of accessibility bugs closed per month/year – 23%
  10. Percentage of UI test cases written within a time period/release that test for explicit accessibility requirements (e.g. keyboard, ARIA,  markup…) – 15%
  11. Average time to close new bugs – 8%
  12. Most common errors detected via automated testing – 8%
  13. Error density per page (# of issues/size of page) – 0%
  14. Percentage of UI tickets closed within a time period/release that were tagged with accessibility (had accessibility requirements) – 0%

Question 2: Which of the following metrics do you track for your company’s accessibility management? (select all that apply)

Ranked by popularity

  1. AODA compliance – 55%
  2. Number of media inquiries about product/company accessibility – 45%
  3. Conference talks by employees on accessibility – 45%
  4. Product VPAT Coverage – 45%
  5. Compliance towards 21st Century Communications Act (CVAA) – 36%
  6. Status of PDF Accessibility – 36%
  7. Training completion per team/engineer – 27%
  8. Percentage of company-paid conference attendees attending accessibility conferences – 27%
  9. Customer empathy training completion – 18%
  10. Percentage of employees that have self-disclosed as having a disability (section 503) – 18%
  11. Number of times product developers used a screen reader or other AT – 9%
  12. Open source contributions related to accessibility – 0%

Question 3: Customer-focused data metrics

  1. Percentage of usability participants who have a disability – 63%
  2. Quantity of feedback via customer support channels – 50%
  3. Percentage of usability activities including user personas with disabilities – 38%
  4. Customer support calls by product/technology – 38%
  5. Number of tickets filed by customer support – 38%
  6. Percentage of beta testers with a disability – 25%
  7. Customers who use our products with AT – 13%
  8. Net Promoter Score for customers with a disability – 13%