Accessibility Data Metrics Survey Results

In preparation for an upcoming talk at CSUN 2017 on data metrics for accessibility, I created a survey for fellow accessibility managers to share what they are doing to quantify the accessibility of their products and services. The following are the raw survey results; I will continue working with the data and responses as we prepare for the presentation: Accessibility Data Metrics and Reporting – Industry Best Practices

Some Initial Observations:

  • Some of the questions were confusing. This is especially true when asking about percentages: we may know the number of events, but have no larger total to compare them against.
  • WCAG conformance is the starting point for many teams. This includes closed captioning support and issues by category.
  • Most people track the priority of issues, but how is this determined? Is it the priority set in bug tracking software or determined by automated tools?
  • Compliance is a key metric for product managers (VPAT, AODA, and CVAA).
  • Employee and customer participation with open source, conferences, training, and beta testing is a key opportunity for metrics.
  • Few companies are using Net Promoter Score for customer satisfaction regarding accessibility.

You can begin understanding your company’s progress by harnessing the data within your bug tracking software. This article has more information: Accessibility Metrics and JIRA.

Question 1:  Which of the following product quality metrics are you currently tracking? (select all that apply)

Ranked by popularity

  1. Issues by Priority – 77%
  2. Issues by WCAG category – 54%
  3. Last time the product had a manual evaluation – 46%
  4. Closed Captioning support – 38%
  5. Number of accessibility bugs opened per month/year – 31%
  6. Test coverage for a product – 23%
  7. Number of accessibility bugs marked as blocked – 23%
  8. Automated tests pass/fail – 23%
  9. Number of accessibility bugs closed per month/year – 23%
  10. Percentage of UI test cases written within a time period/release that test for explicit accessibility requirements (e.g. keyboard, ARIA,  markup…) – 15%
  11. Average time to close new bugs – 8%
  12. Most common errors detected via automated testing – 8%
  13. Error density per page (# of issues/size of page) – 0%
  14. Percentage of UI tickets closed within a time period/release that were tagged with accessibility (had accessibility requirements) – 0%

Question 2: Which of the following metrics do you track for your company’s accessibility management? (select all that apply)

Ranked by popularity

  1. AODA compliance – 55%
  2. Number of media inquiries about product/company accessibility – 45%
  3. Conference talks by employees on accessibility – 45%
  4. Product VPAT Coverage – 45%
  5. Compliance towards 21st Century Communications Act (CVAA) – 36%
  6. Status of PDF Accessibility – 36%
  7. Training completion per team/engineer – 27%
  8. Percentage of company-paid conference attendees attending accessibility conferences – 27%
  9. Customer empathy training completion – 18%
  10. Percentage of employees that have self-disclosed as having a disability (section 503) – 18%
  11. Number of times product developers used a screen reader or other AT – 9%
  12. Open source contributions related to accessibility – 0%

Question 3: Customer-focused data metrics

  1. Percentage of usability participants who have a disability – 63%
  2. Quantity of feedback via customer support channels – 50%
  3. Percentage of usability activities including user personas with disabilities – 38%
  4. Customer support calls by product/technology – 38%
  5. Number of tickets filed by customer support – 38%
  6. Percentage of beta testers with a disability – 25%
  7. Customers who use our products with AT – 13%
  8. Net Promoter Score for customers with a disability – 13%


Accessibility Metrics and JIRA

Many companies use JIRA for bug-tracking and JIRA’s query language can help you pull out some great metrics for managing your company’s product accessibility.

Needle in the Haystack

The first key to JIRA tracking is understanding how to find accessibility issues. Most companies use the term accessibility to refer to data access, servers being offline, setting up employee security access, and so on, so you cannot depend on a text search for accessibility. Further, your company may have an accessibility option within the work type selector; this, too, will be full of unusable tickets.

With this in mind, it’s best to look at alternative methods for defining which tickets are for accessibility. At Intuit, we add “accessibility” to the labels field on appropriate issues. This has worked well, but unfortunately it depends on people adding the label. I also do regular searches of JIRA for “aria-”, “keyboard accessibility”, “VoiceOver”, “JAWS”, and similar terms to find issues that lack the accessibility label. Once you have this label established, you can begin using the JQL queries below to surface your issues.
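If you run these label-gap searches regularly, it helps to script the query generation. Here is a minimal Python sketch that builds the corresponding JQL strings from a term list; the term list and function name are my own illustrations, not part of any JIRA API.

```python
# Sketch: build JQL queries that surface issues mentioning assistive
# technology but lacking the accessibility label. The term list is
# illustrative; extend it with your own product's vocabulary.

AT_TERMS = ["aria-", "keyboard accessibility", "VoiceOver", "JAWS"]

def unlabeled_query(term):
    """JQL for issues whose text matches `term` but lack the accessibility label."""
    return 'text ~ "{}" and labels != accessibility'.format(term)

queries = [unlabeled_query(t) for t in AT_TERMS]
for q in queries:
    print(q)
```

Each generated query can then be pasted into JIRA's issue search (or sent to its REST search endpoint) and the results triaged by hand.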

JIRA Query Language (JQL) Queries

I am using the following queries to build an accessibility dashboard within JIRA. This has become a tremendous asset for understanding the activity and breadth of changes.

All bugs with accessibility label
labels = accessibility
This is your base query. It will pull up all issues that have the accessibility label. The data is raw, and we will begin fine-tuning it below.

Find issues that need the accessibility label
text ~ "aria-" and labels != accessibility
I’m using “aria-” in this text search and hiding tickets that already have the accessibility label. This will allow you to quickly go through your tickets and start adding the label. Switch up the query to include other accessibility terms.

All open accessibility bugs
labels = accessibility and status = Open
We will use the status field often. Status options may vary if your company has customized the set. Common options are:

  • Open
  • In Progress
  • Blocked
  • Integration
  • Reopened
  • Closed

Blocked and Won’t Fix
labels = accessibility and status = Blocked
labels = accessibility and resolution = "Won't Fix"
These are important issues, and I suggest filtering them to the last year or a similar time period so that new issues stand out from legacy ones. These are the tickets that need your attention!

All closed accessibility bugs
labels = accessibility and status = Closed

Accessibility bugs closed within the last 4 weeks
labels = accessibility and status = Closed and resolved > "-4w"
Now we are starting to look at the performance of our engineers. This query is a nice way to see the current rate of issue completion: it asks for issues whose resolved date is less than four weeks ago. I like to add this to a dashboard as a pie chart by project.
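If you export the results of a query like this (for example via JIRA's REST search endpoint), the pie-chart data is just a per-project count. A small Python sketch, assuming issue records shaped like JIRA's REST responses; the project keys in the sample are hypothetical:

```python
from collections import Counter

def closed_by_project(issues):
    """Count issues per project key -- the data behind a by-project pie chart."""
    return Counter(issue["fields"]["project"]["key"] for issue in issues)

# Hypothetical sample records; real ones would come from a JQL search
# such as: labels = accessibility and status = Closed and resolved > "-4w"
sample = [
    {"fields": {"project": {"key": "PROJA"}}},
    {"fields": {"project": {"key": "PROJA"}}},
    {"fields": {"project": {"key": "PROJB"}}},
]
print(closed_by_project(sample))  # Counter({'PROJA': 2, 'PROJB': 1})
```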

Accessibility bugs closed since the beginning of FY17
labels = accessibility and status = Closed and resolved > "2016/08/01"
Intuit’s fiscal year starts on August 1. This query lets you see how many issues have been closed since a particular date. You may change this to January 1, 2016, or a similar date.

Accessibility bugs that are open and have been updated in FY17
labels = accessibility and status = Open and updated > "2016/08/01"
This lets you know which bugs have been touched since a particular date. Updated may mean the issue was reassigned, a new comment was added, it was added to a sprint, or similar activity. This is a good way to see which issues are still alive.
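The same "still alive" cut can be reproduced outside JIRA once you have exported the issues. A quick Python sketch, assuming records carrying JIRA-style ISO timestamps in the updated field; the issue keys and sample data are hypothetical:

```python
from datetime import date

def still_alive(issues, cutoff):
    """Issues whose last update is after `cutoff` -- the JQL 'updated >' cut."""
    return [i for i in issues
            if date.fromisoformat(i["fields"]["updated"][:10]) > cutoff]

sample = [
    {"key": "A11Y-1", "fields": {"updated": "2016-09-15T10:30:00.000-0700"}},
    {"key": "A11Y-2", "fields": {"updated": "2016-05-01T08:00:00.000-0700"}},
]
touched = still_alive(sample, date(2016, 8, 1))  # FY17 start
print([i["key"] for i in touched])  # ['A11Y-1']
```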

Accessibility bugs that are not open or closed
labels = accessibility and status in ("In Progress", New, Verify) ORDER BY status DESC
The middle zone between Open and Closed can be very interesting. This query also sorts the results by status, starting with Verify (descending alphabetically).

This post will be updated as I create new queries. Please share your favorite JIRA queries in the comments below.

Creating Dashboards

I’ll do a post later on making dashboards, but I wanted to share a lesson I learned the hard way. JIRA allows you to create custom dashboards, and you can add these filters to a dashboard to watch progress with nice charts and tables. You can also share that dashboard with other people. However, you MUST save the query as a filter and then share that filter before other people can view it on your shared dashboard. It’s a bit backwards and confusing, but the takeaway is: do not add a search query to your dashboard without first saving and sharing it as a filter.