In preparation for an upcoming talk at CSUN 2017 on data metrics for accessibility, I created a survey for fellow accessibility managers to share what they are doing to quantify the accessibility of their products and services. The following are the raw survey results; I will continue working with the data and responses as we prepare the presentation: Accessibility Data Metrics and Reporting – Industry Best Practices.
Some Initial Observations:
- Some of the questions were confusing. This was especially true for questions about percentages: we may know the number of events, but we can never compare them to an unknown larger set.
- WCAG conformance is the starting point for many teams. This includes closed captioning support and issues by category.
- Most people track the priority of issues, but how is that priority determined? Is it the priority set in bug-tracking software, or is it assigned by automated tools?
- Compliance is a key metric for product managers (VPAT, AODA, and CVAA).
- Employee and customer participation with open source, conferences, training, and beta testing is a key opportunity for metrics.
- Few companies are using Net Promoter Score for customer satisfaction regarding accessibility.
You can begin understanding your company’s progress by harnessing the data within your bug tracking software. This article has more information: Accessibility Metrics and JIRA.
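As a starting point, the issue-by-priority counts discussed throughout these results can be derived from a bug-tracker export. A minimal sketch in Python, assuming accessibility-tagged issues have been exported (e.g. from JIRA) as records with a priority field; the field names and issue keys are hypothetical:

```python
from collections import Counter

def issues_by_priority(issues):
    """Count accessibility issues per priority level and return
    raw counts alongside percentages of the total."""
    counts = Counter(issue["priority"] for issue in issues)
    total = sum(counts.values())
    return {p: (n, round(100 * n / total)) for p, n in counts.items()}

# Hypothetical export of accessibility-tagged issues
issues = [
    {"key": "A11Y-1", "priority": "Blocker"},
    {"key": "A11Y-2", "priority": "Major"},
    {"key": "A11Y-3", "priority": "Major"},
    {"key": "A11Y-4", "priority": "Minor"},
]

print(issues_by_priority(issues))
# {'Blocker': (1, 25), 'Major': (2, 50), 'Minor': (1, 25)}
```

The same Counter pattern works for issues by WCAG category or by product: swap the field being counted.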
Question 1: Which of the following product quality metrics are you currently tracking? (select all that apply)
Ranked by popularity
- Issues by Priority – 77%
- Issues by WCAG category – 54%
- Last time product had a manual evaluation – 46%
- Closed Captioning support – 38%
- Number of accessibility bugs opened per month/year – 31%
- Test coverage for a product – 23%
- Number of accessibility bugs marked as blocked – 23%
- Automated tests pass/fail – 23%
- Number of accessibility bugs closed per month/year – 23%
- Percentage of UI test cases written within a time period/release that test for explicit accessibility requirements (e.g. keyboard, ARIA, markup…) – 15%
- Average time to close new bugs – 8%
- Most common errors detected via automated testing – 8%
- Error density per page (# of issues/size of page) – 0%
- Percentage of UI tickets closed within a time period/release that were tagged with accessibility (had accessibility requirements) – 0%
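Although no respondent tracks it yet, the "error density per page" metric above is simple to compute from the formula given in the option itself. A minimal sketch, assuming "size of page" is measured as the number of DOM elements (one reasonable interpretation, but an assumption here):

```python
def error_density(issue_count, element_count):
    """Error density: accessibility issues found per page element.
    'Size of page' is assumed here to mean DOM element count."""
    if element_count == 0:
        raise ValueError("page has no elements")
    return issue_count / element_count

# e.g. 12 issues found on a page with 400 elements
print(error_density(12, 400))  # 0.03
```

Density normalizes issue counts, so a sprawling dashboard and a simple login page can be compared fairly.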
Question 2: Which of the following metrics do you track for your company's accessibility management? (select all that apply)
Ranked by popularity
- AODA compliance – 55%
- Number of media inquiries about product/company accessibility – 45%
- Conference talks by employees on accessibility – 45%
- Product VPAT Coverage – 45%
- Compliance towards 21st Century Communications Act (CVAA) – 36%
- Status of PDF Accessibility – 36%
- Training completion per team/engineer – 27%
- Percentage of company-paid conference attendees attending accessibility conferences – 27%
- Customer empathy training completion – 18%
- Percentage of employees that have self-disclosed as having a disability (section 503) – 18%
- Number of times product developers used a screen reader or other AT – 9%
- Open source contributions related to accessibility – 0%
Question 3: Customer-focused data metrics
- Percentage of usability participants who have a disability – 63%
- Quantity of feedback via customer support channels – 50%
- Percentage of usability activities including user personas with disabilities – 38%
- Customer support calls by product/technology – 38%
- Number of tickets filed by customer support – 38%
- Percentage of beta testers with a disability – 25%
- Customers who use our products with AT – 13%
- Net Promoter Score for customers with a disability – 13%
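For the few teams tracking Net Promoter Score for customers with a disability, the score itself follows the standard NPS formula: the percentage of promoters (ratings 9-10) minus the percentage of detractors (0-6). A minimal sketch with a hypothetical survey sample:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    on the standard 0-10 'how likely are you to recommend' scale."""
    total = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / total)

# Hypothetical responses: 2 promoters, 2 passives, 2 detractors
print(nps([10, 9, 8, 7, 6, 3]))  # 0
```

Segmenting NPS by disability status (or by AT usage) then only requires filtering the response list before calling the function.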