This presentation was created for the CSUN 2017 conference. It introduces several hidden interactions available within Android and iOS, how they work, and how to make them accessible. Blue Bell Ice Cream is a classic example still live on the web: it uses an image map without alt text, so the user must hover over each image to see what it represents.
Android Touch and Hold
A.K.A. Android’s right click. A long press adds context-specific menus.
Default gesture: touch and hold
With TalkBack enabled: double tap and hold to long press
This short video shows how you can use touch and hold (long press) to quickly change a category or merchant name within the Mint application.
onLongClick: called when a view has been clicked and held.
iOS 3D Touch was introduced on the iPhone 6S. It detects the pressure a person applies to the screen with their finger. A light touch is seen as a tap. A medium touch will trigger a Peek view. A continued firm touch will launch the Peek’s content into a full screen.
This also allows a person to trigger a shortcut menu on app icons.
Peek: quick glance at relevant information
Pop: open the full content previewed in the Peek
Quick Actions: custom task list from the app icon
User Experience: A light press opens a hovering window so you can “Peek” at the content. When you press just a little bit harder, you will “Pop” into the actual content you’d just been previewing in a Peek.
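The Peek and Pop flow can be sketched with UIKit’s 3D Touch previewing API, UIViewControllerPreviewingDelegate (available when this talk was given). This is a minimal sketch, not the app’s actual code; DetailViewController and its item parameter are hypothetical stand-ins for your own detail screen.

```swift
import UIKit

// Hypothetical detail screen used as the Peek/Pop destination.
class DetailViewController: UIViewController {
    init(item: Int) { super.init(nibName: nil, bundle: nil) }
    required init?(coder: NSCoder) { fatalError("not used") }
}

class ListViewController: UITableViewController, UIViewControllerPreviewingDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Only register when the device actually supports 3D Touch.
        if traitCollection.forceTouchCapability == .available {
            registerForPreviewing(with: self, sourceView: tableView)
        }
    }

    // Peek: a light press returns the preview view controller.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           viewControllerForLocation location: CGPoint) -> UIViewController? {
        guard let indexPath = tableView.indexPathForRow(at: location) else { return nil }
        // Blur everything except the row being previewed.
        previewingContext.sourceRect = tableView.rectForRow(at: indexPath)
        return DetailViewController(item: indexPath.row)
    }

    // Pop: a firmer press commits the previewed content full screen.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           commit viewControllerToCommit: UIViewController) {
        show(viewControllerToCommit, sender: self)
    }
}
```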
Alternative actions allow users to quickly make changes without having to open a detail screen. For instance, they can delete a transaction or change an email’s status. The standard interface is to display the options when a user swipes a row to the left. For VoiceOver users, the options are announced as alternate actions.
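Swipe-to-reveal options like these can be built with table view row actions, which VoiceOver exposes through its actions rotor. A minimal sketch, assuming a hypothetical transactions list with illustrative Delete and Edit Category actions:

```swift
import UIKit

// Hypothetical transactions list. Swiping a row left reveals the
// actions; VoiceOver announces them as available custom actions.
class TransactionsViewController: UITableViewController {

    override func tableView(_ tableView: UITableView,
                            editActionsForRowAt indexPath: IndexPath) -> [UITableViewRowAction]? {
        let delete = UITableViewRowAction(style: .destructive, title: "Delete") { _, path in
            // deleteTransaction(at:) would be your own model code.
            print("Delete transaction at row \(path.row)")
        }
        let editCategory = UITableViewRowAction(style: .normal, title: "Edit Category") { _, path in
            print("Edit category for row \(path.row)")
        }
        return [delete, editCategory]
    }
}
```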
It’s Deductible Actions
This short video shows how the alternative actions menu is used in standard mode and how VoiceOver announces the options.
TurboTax uses a custom swipe-based navigation between views. It lacks buttons or prompts to move back and forth. User testing has shown it to be effective for sighted users, but it required some extra work for accessibility.
Default Experience With VoiceOver
The default experience on Turbo Tax uses a custom swipe gesture that lacks navigation buttons.
TurboTax detects a user’s Screen Reader/Switch Control status to show navigation buttons on Android and iOS
This video shows the default and VoiceOver/SwitchControl experience.
Notice in the standard experience how the screen tracks the user’s finger movement. This is not a standard swipe gesture, so it will not work with VoiceOver enabled.
We detect whether VoiceOver or Switch Control is running and display alternate Back and Continue buttons.
Animated transition between views
Next and Back flow with every screen
Eliminates navigation buttons
No buttons? Accessibility?
Have I reached the end of the screen?
Instead of a basic swipe gesture, this interface tracks the movement of the finger across the screen. This allows the animation to match the user’s finger speed for a more natural, responsive experience. However, the finger touch is intercepted by VoiceOver, so the custom navigation does not work when VoiceOver is enabled.
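The finger-tracking behavior described above can be sketched with a pan gesture recognizer that drives the transition progress directly. This is an illustrative sketch, not TurboTax’s code; updateTransition(to:), finishTransition(), and cancelTransition() are assumed helpers that scrub, complete, or roll back the animation.

```swift
import UIKit

// Sketch of finger-tracking navigation: a pan gesture drives the
// transition progress instead of firing on a discrete swipe.
class FlowViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        view.addGestureRecognizer(pan)
    }

    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        let translation = gesture.translation(in: view)
        // Progress follows the finger, so the animation matches finger speed.
        let progress = max(0, min(1, -translation.x / view.bounds.width))
        switch gesture.state {
        case .changed:
            updateTransition(to: progress)
        case .ended:
            if progress > 0.5 { finishTransition() } else { cancelTransition() }
        default:
            break
        }
    }

    // Assumed helpers that scrub, complete, or roll back the animation.
    func updateTransition(to progress: CGFloat) {}
    func finishTransition() {}
    func cancelTransition() {}
}
```

Because VoiceOver intercepts the touch before this recognizer ever sees it, the detection helper described next is what restores navigation for screen reader users.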
How can we refactor code to detect any accessibility-related settings and address them?
Helper function to the rescue!
NSNotificationCenter adds observers to track any settings that may require us to show buttons.
This uses OR logic. Example: if the VoiceOver OR Switch Control status changed, display the buttons.
A Boolean is assigned a value: true if buttons need to be shown.
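A sketch of such a helper, using the current Swift names for the UIAccessibility notifications (the Objective-C/Swift 3 names from the talk era differ); the class name and onChange callback are illustrative, not from the talk:

```swift
import UIKit

// Observes assistive-technology status and exposes a single Boolean:
// true when alternate Back/Continue buttons should be shown.
final class AccessibleNavigationHelper {

    private(set) var shouldShowNavigationButtons = false

    /// Called whenever the Boolean changes, so the UI can react.
    var onChange: ((Bool) -> Void)?

    init() {
        let center = NotificationCenter.default
        // Observe every setting that may require us to show buttons.
        let names: [Notification.Name] = [
            UIAccessibility.voiceOverStatusDidChangeNotification,
            UIAccessibility.switchControlStatusDidChangeNotification
        ]
        for name in names {
            center.addObserver(self, selector: #selector(statusChanged),
                               name: name, object: nil)
        }
        statusChanged() // establish the initial state
    }

    @objc private func statusChanged() {
        // OR logic: show buttons if VoiceOver OR Switch Control is running.
        let show = UIAccessibility.isVoiceOverRunning
            || UIAccessibility.isSwitchControlRunning
        if show != shouldShowNavigationButtons {
            shouldShowNavigationButtons = show
            onChange?(show)
        }
    }
}
```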
In preparation for an upcoming talk at CSUN 2017 on data metrics for accessibility, I created a survey for fellow accessibility managers to share what they are doing to quantify the accessibility of their products and services. The following are the raw survey results; I will continue working with the data and responses as we prepare for the presentation: Accessibility Data Metrics and Reporting – Industry Best Practices.
Some Initial Observations:
Some of the questions were confusing. This is especially true when asking about percentages, as we may know the number of events, but could never compare them to an unknown larger set.
WCAG conformance is the starting point for many teams. This includes closed captioning support and issues by category.
Most people track the priority of issues, but how is this determined? Is it the priority set in bug tracking software or determined by automated tools?
Compliance is a key metric for product managers (VPAT, AODA, and CVAA)
Employee and customer participation with open source, conferences, training, and beta testing is a key opportunity for metrics.
Few companies are using Net Promoter Score for customer satisfaction regarding accessibility.
You can begin understanding your company’s progress by harnessing the data within your bug tracking software. This article has more information: Accessibility Metrics and JIRA.
Question 1: Which of the following product quality metrics are you currently tracking? (select all that apply)
Ranked by popularity
Issues by Priority – 77%
Issues by WCAG category – 54%
Last time product had a manual evaluation – 46%
Closed Captioning support – 38%
Number of accessibility bugs opened per month/year – 31%
Test coverage for a product – 23%
Number of accessibility bugs marked as blocked – 23%
Automated tests pass/fail – 23%
Number of accessibility bugs closed per month/year – 23%
Percentage of UI test cases written within a time period/release that test for explicit accessibility requirements (e.g. keyboard, ARIA, markup…) – 15%
Average time to close new bugs – 8%
Most common errors detected via automated testing – 8%
Error density per page (# of issues/size of page) – 0%
Percentage of UI tickets closed within a time period/release that were tagged with accessibility (had accessibility requirements) – 0%
Question 2: Which of the following metrics do you track for your company’s accessibility management? (select all that apply)
Ranked by popularity
AODA compliance – 55%
Number of media inquiries about product/company accessibility – 45%
Conference talks by employees on accessibility – 45%