Accessibility Data Metrics and Reporting – Industry Best Practices

The key to good decision making is evaluating the available information – the data – and combining it with your own estimates of pluses and minuses. As an economist, I do this every day.

– Emily Oster, Brown University

Three Stages

  1. Accessibility Management Goals
  2. State of the Industry
  3. Solutions to fill gaps

Current State

  • Manual Testing and Evaluations
  • Continual improvements
  • Documentation
  • Community Outreach
  • Stakeholders
  • Centralized Accessibility Leadership

What we have

  • Product database with evaluations, scores, stakeholders, and documents
  • Bug Tracking
  • Scattered Employee Resources
  • Centralized documentation
  • Non-integrated automation

What we need

  • Dashboards to surface data and trends
  • Connections between platforms
  • Automation and unit testing for developers
  • Shared test results
  • Connected resources
  • Support channel integration

Future Goal

  • Measurable life-cycle from tests/evaluation to production
  • Deep consumer feedback integration
  • Diverse workforce with support channels
  • Developer ownership of testing and solutions
  • Product ownership of process and success

State of the Industry

How are we doing?
– Product Manager

Accessibility is not easy to measure. So how do we give our leaders and colleagues the information they need to understand the state of our products’ accessibility, our customer service, and our support for fellow employees? This talk looks at how people across the industry are collecting and distributing this data. The presentation was developed as we began cataloguing and sharing data metrics within Intuit, but much of it is owed to fellow accessibility managers around the world.

What to Measure?

Tablespoon and Teaspoon measurement tattooed onto a Hand
We posted a survey to accessibility managers to find out what data they are collecting and evaluating for success. See the complete Accessibility Industry Survey Results.

Product Quality Metrics

  • Issues by Priority – 77%
  • Issues by WCAG category – 54%
  • Last time the product had a manual evaluation – 46%
  • Closed Captioning support – 38%
  • Number of accessibility bugs opened per month/year – 31%

The most common pattern is to break down the metrics by WCAG category and assign a priority to each issue, which is typically captured when the bug is created. Automated testing tools are also able to assign priorities to the issues they detect. JIRA and other bug-tracking software make it easy to track priorities, freshness, time to completion, and more.

Knowing the last date of evaluation is also critical for understanding the current health of a product. There’s little confidence in a score if it has not been updated in 2 years.

Priority Levels

  • Impact on the customer
  • Whether the issue is in a shared template, a module, or unique to one page
  • Density of issues
  • High-traffic pages and key workflows

Test to Ticket flow

  1. Product for Analysis
  2. Detailed Evaluation
  3. Accessibility Evaluation Report
  4. Create Stories and Bugs in JIRA (or other bug tracking software)
  5. Discuss Solutions / Resolve Based on Priority
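
As a sketch of step 4, findings from an evaluation report can be pushed into JIRA through its REST API (POST /rest/api/2/issue). The instance URL, project key, and credentials below are placeholders for illustration.

    // Create an accessibility bug in JIRA from an evaluation finding.
    // Uses Node 18+'s global fetch; the URL, project key, and token are placeholders.
    async function createAccessibilityBug(summary: string, description: string): Promise<string> {
      const response = await fetch('https://jira.example.com/rest/api/2/issue', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          Authorization: `Basic ${Buffer.from('user@example.com:API_TOKEN').toString('base64')}`,
        },
        body: JSON.stringify({
          fields: {
            project: { key: 'A11Y' },          // hypothetical project key
            issuetype: { name: 'Bug' },
            summary,
            description,
            labels: ['accessibility'],         // matches the label used in the queries later in this post
          },
        }),
      });
      if (!response.ok) {
        throw new Error(`JIRA returned ${response.status}`);
      }
      const issue = await response.json();
      return issue.key;                        // e.g. "A11Y-123"
    }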

Company Metrics

  • AODA compliance – 55%
  • Number of media inquiries about product/company accessibility – 45%
  • Conference talks by employees on accessibility – 45%
  • Product VPAT Coverage – 45%
  • Compliance towards 21st Century Communications Act (CVAA) – 36%

Compliance is a key metric for product managers (VPAT, AODA, and CVAA). External activity is also a key metric, especially media inquiries and participation in external events. Education, training, and empathy-session tracking are a big opportunity for data collection.

Compliance and Legal

  • Number of issues resolved in this domain
  • Tracking keeps a record, which is especially important for critical areas like compliance
  • How accessible is each product?
  • Which products have a VPAT?

Human Resources

  • Section 503: Self-identification
  • Accommodations Requests
  • Ergonomic Evaluations
  • Service/Perk Requests

Section 503: The updated rule sets a goal for federal contractors to have 7% of their workforce be individuals with disabilities (veterans are covered by a separate rule). This depends on self-identification.

It’s important to track official and non-official accommodation requests. Montana provides a useful summary of what to track for accommodations.

Customer Metrics

  • Usability participants with a disability – 63%
  • Feedback via customer support – 50%
  • Usability activities, including user personas, with disabilities – 38%
  • Customer support calls – 38%
  • Tickets filed by customer support – 38%

Usability studies with people who have a disability are a key metric. Tracking customer support is also popular, but this may require some planning so that support agents apply keywords consistently. Net Promoter Score is under-utilized.

Net Promoter Score

SurveyMonkey makes it very easy to create a Net Promoter Score survey to gather your customers’ feedback. This could be done for the product in general or for individual releases and features. NPS looks at the balance of detractors, passives (neutrals), and promoters: you want people to love your product so much that they tell their friends about it. A negative NPS means people are not happy with your product; anything positive is good, an NPS of 50 is excellent, and above 70 is exceptional.
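
For reference, NPS is calculated from 0–10 survey responses: 9–10 are promoters, 7–8 are passives (the neutrals), 0–6 are detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch:

    // Net Promoter Score from 0-10 "how likely are you to recommend us?" responses.
    function netPromoterScore(responses: number[]): number {
      const promoters = responses.filter(r => r >= 9).length;
      const detractors = responses.filter(r => r <= 6).length;
      return Math.round(((promoters - detractors) / responses.length) * 100);
    }

    // Example: 6 promoters, 2 passives, and 2 detractors out of 10 responses gives an NPS of 40.
    netPromoterScore([10, 9, 9, 10, 9, 9, 8, 7, 5, 3]);  // 40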

QuickBase

QuickBase is a collaborative project management product. It used to be an Intuit product but was spun off recently. We use QuickBase within Intuit to track our accessibility metrics. You can easily create shared databases, store documents, and generate reports.

Pricing starts at $15/month, but it is worth upgrading to the $25/month level.

Spreadsheets

According to our survey, spreadsheets are a popular method for storing and sharing data on accessibility compliance. Excel provides an accessible presentation of the data, macros enable complex analysis, and individual teams can extract the data for their own reporting and analysis. WorldSpace and other testing tools allow test results to be exported as spreadsheets, which are often stored on internal collaborative spaces such as Box.
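
If spreadsheets are the shared format, automated test results can be flattened to CSV so individual teams can pull them into Excel. A minimal sketch; the issue shape is an assumption, not any particular tool’s export format:

    import { writeFileSync } from 'node:fs';

    // Hypothetical issue shape; real exports from testing tools will differ.
    interface IssueRow { product: string; wcag: string; priority: string; status: string; }

    // Naive CSV writer; assumes field values contain no commas or quotes.
    function toCsv(rows: IssueRow[]): string {
      const header = 'product,wcag,priority,status';
      const lines = rows.map(r => [r.product, r.wcag, r.priority, r.status].join(','));
      return [header, ...lines].join('\n');
    }

    writeFileSync('a11y-issues.csv', toCsv([
      { product: 'TurboTax', wcag: '1.4.3 Contrast (Minimum)', priority: 'P2', status: 'Open' },
    ]));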

Pa11y Testing + Dashboard

Pa11y is an open-source suite of automated accessibility testing tools that includes a dashboard for tracking results over time. You can also use its command-line client and testing webservice within your testing strategy. See Setting up An Accessibility Dashboard from Scratch with Pa11y on DigitalOcean.
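
A minimal sketch of running Pa11y programmatically from Node, assuming the pa11y package is installed (the URL is a placeholder):

    // Run Pa11y against a single URL and log the issues it reports.
    // pa11y does not ship TypeScript types, so a plain require keeps the sketch simple.
    const pa11y = require('pa11y');

    async function auditPage(url: string): Promise<void> {
      const results = await pa11y(url);
      for (const issue of results.issues) {
        console.log(`${issue.code}: ${issue.message} (${issue.selector})`);
      }
    }

    auditPage('https://example.com').catch(console.error);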

Google Docs

Google Docs was mentioned by survey respondents. It’s a cloud-based collaborative product, and Google’s APIs (the Sheets API in particular) let you import data automatically. This could be a great option for surfacing automated test results.
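
A hedged sketch of appending automated test counts to a Google Sheet with the googleapis Node client; the spreadsheet ID and range are placeholders, and authentication setup (for example, a service account) is omitted:

    import { google } from 'googleapis';

    // Append one row of daily results to a tracking sheet.
    // Assumes `auth` was created elsewhere (e.g. with google.auth.GoogleAuth and a service account).
    async function appendDailyResults(auth: any, openIssues: number, closedIssues: number): Promise<void> {
      const sheets = google.sheets({ version: 'v4', auth });
      await sheets.spreadsheets.values.append({
        spreadsheetId: 'SPREADSHEET_ID',        // placeholder
        range: 'Sheet1!A:C',
        valueInputOption: 'USER_ENTERED',
        requestBody: {
          values: [[new Date().toISOString().slice(0, 10), openIssues, closedIssues]],
        },
      });
    }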

Cloud Storage

These services allow you to store a document and share it across teams. Support varies for versioning and setting security levels.

WorldSpace

WorldSpace was mentioned by several respondents in the survey about accessibility metrics. It is just one of many tools that provide automated testing and reporting. These tools give access to historical data, prioritized issues, and categorization by standards, and their data can be exported for further reporting.

Solutions

Bug Tracking

Example JIRA dashboard
This dashboard tracks JIRA tickets for accessibility at Intuit. These graphs and lists make it easy to find issues that need assistance and to see how teams are progressing.

Your company’s bug-tracking software is a gold mine for metrics; you just need to spend some time configuring queries to get useful information. JIRA Software offers flexible issue and project tracking with best-in-class agile tooling for software teams. It provides bug tracking, issue tracking, and project management functions. According to Atlassian, Jira is used by over 25,000 customers in 122 countries around the globe (Intuit being one of them).

JIRA Queries
  • labels = accessibility
  • labels = accessibility AND status = Open
  • labels = accessibility AND status = Blocked
  • labels = accessibility AND status = Closed AND resolved > "-4w"

While this focuses on Jira, one of the most popular bug-tracking tools, you could build similar queries in Bugzilla and other products. You can’t depend on a free-text search for accessibility; it will pull up too many tickets about databases, security, servers, and so on. Use labels = accessibility or something similar instead. Jira Query Language (JQL) is fairly simple and can generate great results. See Using Jira for accessibility management.

This dashboard provides a quick view of accessibility progress across the company. The charts spotlight issues that need attention, such as those that are blocked or closed as won’t fix. It also gives a quick view of issues closed by priority or product. Some graphs show changes within the last four weeks, which can reflect product timelines.
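
The same JQL can feed a dashboard or report programmatically through JIRA’s REST search endpoint; a sketch with a placeholder instance URL and credentials:

    // Count open accessibility issues using JIRA's REST search endpoint and JQL.
    // Uses Node 18+'s global fetch; the URL and token are placeholders.
    async function countOpenAccessibilityIssues(): Promise<number> {
      const jql = encodeURIComponent('labels = accessibility AND status = Open');
      const response = await fetch(
        `https://jira.example.com/rest/api/2/search?jql=${jql}&maxResults=0`,
        {
          headers: {
            Authorization: `Basic ${Buffer.from('user@example.com:API_TOKEN').toString('base64')}`,
          },
        },
      );
      const body = await response.json();
      return body.total;                        // the search response includes a total match count
    }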

DOMO

Intuit teams use DOMO to share various metrics related to accessibility, security, server performance, build validation, and much more. It consolidates multiple data sources into simplified visual representations, gives visibility across business units and teams, and lets users drill into the data. It’s important to integrate your accessibility metrics into the same tool that teams are already using on a daily basis.

Integrated Testing

  • Developer testing: Tenon, aXe, Nemo
  • Unit tests include accessibility
  • Export test results to JIRA

Making a tool available is not enough. We need to know when it is being used and track issues detected via the tests.
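
One hedged example of the “unit tests include accessibility” item: the jest-axe package wraps the aXe engine so a component test can fail on detectable violations. The component and file names below are hypothetical, and the sketch assumes a React project with @testing-library/react and jest-axe installed.

    // SignUpForm.test.tsx – hypothetical component test that includes an aXe check.
    import * as React from 'react';
    import { render } from '@testing-library/react';
    import { axe, toHaveNoViolations } from 'jest-axe';
    import { SignUpForm } from './SignUpForm';   // hypothetical component under test

    expect.extend(toHaveNoViolations);

    test('sign-up form has no detectable accessibility violations', async () => {
      const { container } = render(<SignUpForm />);
      const results = await axe(container);
      expect(results).toHaveNoViolations();
    });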

Finding Outliers

Sparkline graph of test results
Sparklines make it easy to see when a project improves or problems are introduced.
  • Extremely critical to prioritize issues
  • Outliers need to be identified and fixed first
  • Example – TurboTax in-season production bugs

It’s critical to detect outliers, such as a jump in faults, as quickly as they appear. Sparkline graphs are a simple method for tracking extreme changes.
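
Alongside the sparkline, a simple numeric check can flag the same outliers automatically, for example when a week’s new-issue count jumps well past the recent average. The doubling threshold below is an arbitrary assumption:

    // Flag weeks where the new-issue count is more than double the trailing average.
    function findOutlierWeeks(weeklyCounts: number[], window = 4): number[] {
      const outliers: number[] = [];
      for (let i = window; i < weeklyCounts.length; i++) {
        const recent = weeklyCounts.slice(i - window, i);
        const average = recent.reduce((sum, n) => sum + n, 0) / window;
        if (weeklyCounts[i] > 2 * average) {
          outliers.push(i);                     // index of the week that spiked
        }
      }
      return outliers;
    }

    // Example: a steady baseline, then a spike in week index 5.
    findOutlierWeeks([3, 4, 3, 5, 4, 15]);      // [5]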

JIRA helps when it’s critical to track what needs to be done, how old it is, and what its priority is. Dashboards also help surface the latest updates. A single misrepresentation of the data can impact customers and the business directly.

Customer Support Logs

  • Sample search for Deaf and HOH customer support logs:
    [ LEVEL:SENTENCE TYPE:KEYWORD {deaf, hearing}] 
    [ LEVEL:SENTENCE TYPE:AND_KEYWORD {capability, impaired, challenged, hard}]
  • “Accessibility” topic within Salesforce for reports
  • Microsoft: Dedicated Support Channel

The above search was used on our customer support logs to pull out calls that helped us evaluate our support system and make improvements. Salesforce allows agents to add a topic to a support call; encourage them to use “accessibility” or another standard topic to better track customer feedback.
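
When the log platform’s query language isn’t available, the same keyword pairing can be applied to exported transcripts; a rough sketch whose keyword lists mirror the search above:

    // Flag transcripts where a sentence pairs a hearing-related term with a qualifier term,
    // mirroring the sentence-level keyword search shown above.
    const primaryTerms = ['deaf', 'hearing'];
    const qualifierTerms = ['capability', 'impaired', 'challenged', 'hard'];

    function isAccessibilityRelated(transcript: string): boolean {
      return transcript
        .toLowerCase()
        .split(/[.!?]/)                         // rough sentence split
        .some(sentence =>
          primaryTerms.some(term => sentence.includes(term)) &&
          qualifierTerms.some(term => sentence.includes(term)));
    }

    isAccessibilityRelated('The customer is hard of hearing and asked about captions.');  // true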

Microsoft has created a customer support center specifically for their customers with a disability: Microsoft’s Disability Answer Desk

What Did You Need?

We asked participants to write down questions they need to answer for their teams. The following is the compiled list. How do these match your expectations and needs?

  • How to track and report on application reviews for a11y
  • Learn ideas for metrics quantification of a11y reporting of progress
  • What does partial compliance really mean?
  • Which tools/dashboard/CI options do we have? How do they integrate? What about walled gardens?
  • What is the severity level breakdown of our a11y defects?
  • What are the right things to measure?
  • How to put dollars saving around a11y?
  • How is progress tracked over time?
  • How accessible is project?
  • How much time to comply?
  • How will it impact velocity?
  • How can we integrate a11y in our workflow?
  • How can we test build user test?
  • How have we improved?
  • How does feature X impact a11y?
  • I love metrics. I want to learn more about how you measure a11y – what you measure and how, how you communicate your goals – and whether we should consider changing what we do.
  • How many people use our site with screen readers and which ones are they using?
  • What are the methods to identify people that use assistive technologies when browsing on a website? How the metrics about them are collected?
  • How to collect a11y metrics and use them to drive development?
  • I am new to a11y metrics so I am hoping to learn more about what others are using + best practices?
  • I work for an agile shop that likes to make decisions based on data, so I would like to show metrics indicating that we are making progress on a11y
  • How are defects tracked?
  • Should tracking occur at a general level or technical level?
  • What is the ratio of a11y defects to functional defects?
  • How to scale up the size/power of manual testing?
  • How long do a11y defects take to fix as compared to functional?
