Accessibility is an important part of any app. Whether you’re developing a new app or improving an existing one, ensure that components are accessible to everyone.
Why develop for accessibility
1 in 5 people will have a disability in their lifetime. – 2010 census
Designing for accessibility benefits people who are blind, have low vision, or have their eyes occupied (e.g., while driving)
Android includes 4 types of assistive technology:
TalkBack: Screen reader
BrailleBack: Braille output for refreshable braille devices
Switch Access: switch control of device
Voice Access: control device by voice activation: “scroll up”
Android O’s major focus: increase productivity for users
new API additions for accessibility
support for print disabilities (reading disabilities)
New to TalkBack
accessibilityVolume: adjusts audio volume for accessibility independently from media, so you can watch YouTube and control its volume separately from TalkBack. Available when TalkBack is on.
Volume from YouTube is quieted while TalkBack is speaking, then fades back into the foreground. There’s a new accessibility volume slider.
New gestures for talkback.
If the device has a fingerprint sensor on the back, TalkBack users can use it. The sensor has its own set of customizable gestures, for instance swipe up on the fingerprint sensor, and these can be assigned to actions such as long press.
Quickly enable/disable TalkBack
Long press both volume keys to quickly turn TalkBack on or off.
This works on any screen, which makes it easier to test apps and to turn TalkBack off to type information. Press both keys at the same time and hold; after a moment TalkBack toggles on/off. The accessibility shortcut can instead be assigned to Switch Access, zoom, or another service.
The new text-to-speech engine can handle multiple languages. Use LocaleSpan to trigger language switching.
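A minimal sketch of applying a LocaleSpan so the TTS engine switches languages mid-sentence; the `textView` variable and the example string are assumptions for illustration:

```java
// Mark the word "chat" as French so a multilingual TTS engine
// (and TalkBack) can switch pronunciation for just that span.
SpannableString text = new SpannableString("The French word chat means cat.");
text.setSpan(new LocaleSpan(Locale.FRENCH),
        16, 20, // character range covering "chat"
        Spanned.SPAN_EXCLUSIVE_EXCLUSIVE);
textView.setText(text);
```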
2 new APIs
Continuous Gesture API: enables motor-impaired users who use a head tracker to perform drag and drop, zoom, etc.
A new accessibility button is located in the navigation bar. This allows users to quickly invoke context-dependent accessibility features. It sits in the same row as the Back and Home buttons.
People with dyslexia, low vision, or who are learning a new language can now use Select to Speak, part of TalkBack 5.2. Select an element on screen and TalkBack will read it. A floating action button enables it.
In Android O: read the whole page, advanced controls, word-level highlighting, and a setup wizard.
Manual testing: try your app with TalkBack and Switch Access.
If it’s OK in TalkBack, it should be good for BrailleBack and Select to Speak.
If it works with Switch Access, it should also work with Voice Access.
Android Things security: the main problem in IoT security is economics, the cost of building in security vs. the cost of risking a breach. Smaller, lower-cost devices may not build in security. Exploits have become their own market.
How much does it cost an attacker to create an attack?
How valuable is it?
Can the attack scale enough to be valuable? WiFi injection may be effective, but it doesn’t scale if you have to be near the router. Default credentials (e.g., security cams) can be attacked by the thousands.
Does the attack give the attacker privileges to hardware or accounts?
Does the attack give the attacker significant persistence? Can it survive a device reboot?
Security Cost: not every company has the resources to build and maintain security features and infrastructure.
Android Things goal: raise attack costs, reduce ROI, and reduce security costs.
OS Hardening: All of Android’s hardening is enabled in Android Things. Permissions, app sandbox, mandatory access control (SELinux), kernel syscall filtering, full ASLR, FORTIFY, stack-protector-strong…
Developer Action: Declare permissions only as needed, split out privileged code.
All Android Things devices will get infrastructure updates directly from Google. This reduces attack persistence and drives down the attack ROI. Updates can be deferred during critical operations, such as while a drone is flying. Developers can also test updates and request that an update be stopped.
Android Instant Apps let Android users run your apps instantly, without installation. Users experience what they love about apps (fast and beautiful user interfaces, high performance, and great capabilities) with just a tap.
Android Instant Apps is now open to all developers, so anyone can build and publish an instant app today.
Use the same code to generate the instant app; use feature modules to define what can be included/removed for instant apps. Look for the modularize feature in Android Studio. Optimization tools to make these features faster are also coming.
What’s new in Android O keynote…
Picture-in-picture is coming to Android O, similar to the way YouTube lets a video shrink to a thumbnail while you scroll other videos in a search. Only now the video can stay a thumbnail while you open Evernote and write notes.
Autofill will be included in basic form inputs (TextView, EditText); no extra work required.
You can use hints to the autofill API to define data types. There are autofill APIs for custom views and opaque hierarchies.
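A quick sketch of setting an autofill hint from code (it can also be declared in XML via android:autofillHints); the `R.id.email` field is an assumption for illustration:

```java
// Tell the autofill framework this field holds an email address (API 26+).
EditText emailField = findViewById(R.id.email);
emailField.setAutofillHints(View.AUTOFILL_HINT_EMAIL_ADDRESS);
```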
Text stuff: font files can be added to a font resource directory. Downloadable fonts: declare a font to be downloaded and cached. Font provider in Google Play services v1; a beta version is available. Access to all 800 Google Fonts.
Auto-sizing Text Views
TextViews with auto-sizing change font size as you resize their container. In the past, the container might grow while the text stayed the same size. Now the text can grow with the container.
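A sketch of enabling auto-sizing from code, assuming a `myTextView` reference; the support library’s TextViewCompat backports this behavior:

```java
// Uniform auto-sizing: the text scales to fit the TextView's bounds
// (natively API 26+, backported by the support library).
TextViewCompat.setAutoSizeTextTypeWithDefaults(
        myTextView, TextViewCompat.AUTO_SIZE_TEXT_TYPE_UNIFORM);
```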
Language detection, accessibility button, separate volume controls, fingerprint gestures
View.java (before O): public View findViewById(int id);
TextView tv = (TextView) findViewById(R.id.mytextview);
View.java (Android O): public <T extends View> T findViewById(int id);
TextView tv = findViewById(R.id.mytextview);
We will need to handle adaptive icons in future releases. Developers provide a background layer plus a foreground layer, and the system applies a mask per device.
Ask people to pin our app to the home screen: ShortcutManager and AppWidgetManager, requestPinAppWidget…
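A hedged sketch of the pinning flow using ShortcutManager.requestPinShortcut (requestPinAppWidget on AppWidgetManager follows the same pattern); the shortcut id and MainActivity are assumptions for illustration:

```java
// Ask the launcher to pin a shortcut for this app (API 26+).
ShortcutManager sm = context.getSystemService(ShortcutManager.class);
if (sm != null && sm.isRequestPinShortcutSupported()) {
    ShortcutInfo shortcut = new ShortcutInfo.Builder(context, "my_shortcut_id")
            .setShortLabel("Open app")
            .setIntent(new Intent(context, MainActivity.class)
                    .setAction(Intent.ACTION_VIEW)) // intents need an action
            .build();
    sm.requestPinShortcut(shortcut, null); // null = no result callback
}
```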
Notifications are getting more power in O. The user should always be in control. Users and developers want the ability to tweak notifications from an app. Notification channels give developers and users fine-grained control: apps define channels, assign notifications to channels, and post notifications.
Note: once you target O, you must use channels or your notifications will be dropped!
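A minimal sketch of the channel flow: create the channel once, then post to it. The channel id "updates" and the icon resource are assumptions for illustration:

```java
// 1. Create (or re-register, which is a no-op) the channel.
NotificationManager nm = context.getSystemService(NotificationManager.class);
NotificationChannel channel = new NotificationChannel(
        "updates", "App updates", NotificationManager.IMPORTANCE_DEFAULT);
channel.setDescription("General update notifications");
nm.createNotificationChannel(channel);

// 2. Post a notification targeted at that channel (API 26+ builder).
Notification notification = new Notification.Builder(context, "updates")
        .setContentTitle("Sync complete")
        .setSmallIcon(R.drawable.ic_sync) // assumed drawable resource
        .build();
nm.notify(1, notification);
```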
StrictMode: ThreadPolicy gains unbuffered I/O detection; new VmPolicy checks.
Media file access
Seekable file descriptors from a custom document provider: useful for large remote sources. Cached data: stay below the quota to avoid aggressive deletion. Use StorageManager.
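A sketch of serving a seekable descriptor for a large remote file via StorageManager (API 26+); `RemoteFileCallback` is an assumed subclass of ProxyFileDescriptorCallback that implements onRead/onGetSize against the remote source:

```java
// Return a seekable ParcelFileDescriptor from a DocumentsProvider
// without downloading the whole remote file first.
StorageManager storage = context.getSystemService(StorageManager.class);
ParcelFileDescriptor pfd = storage.openProxyFileDescriptor(
        ParcelFileDescriptor.MODE_READ_ONLY,
        new RemoteFileCallback(),               // assumed callback class
        new Handler(Looper.getMainLooper()));   // callbacks run on this thread
```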
Lighthouse is being integrated directly into Chrome DevTools.
Firebase now offers real-time analytics. Cloud Functions for Firebase sounds like Akamai for small Node functions, such as resizing images, handling these tasks in a global, cached environment.
Google IO: state of the mobile web
Chrome’s mission: move the web platform forward
Scroll anchoring: prevents page jumps when content (such as a banner ad) loads after the page renders. It locks the viewport position even when content loads above. Scroll anchoring prevents about 3 page jumps per load on average.
AMP: Accelerated Mobile Pages. improve mobile experience.
On average, AMP pages load in less than a second and use 10x less data
LinkedIn found people were 10% more likely to read an article when it is AMP.
amp-bind: merchants can build e-commerce experiences
Progressive web apps: app focused experiences, reliable, fast, engaging.
Twitter Lite is a more accessible, faster, and more affordable way for people to use Twitter when they are on slow mobile networks, have expensive data plans, or have limited storage on their mobile device.
PWAs can be added to the home screen. Developers will soon be able to control the add-to-home-screen prompt. PWAs can be displayed in the app launcher, Android settings, Android intents, and notifications, and can launch as a full-screen immersive view.
123B in US alone
PaymentRequest: simple web payments within Chrome.
PaymentRequest can now use more forms of payment: PayPal, Alipay, Samsung Pay… Could this integrate QuickBooks Payments?
India’s largest ride-sharing app is using a PWA: over 1M daily rides, 110 cities, and 600K drivers. They needed to serve customers with low-cost phones, minimal data plans, and bad connections to reach the entire customer base.
Offline support and caching provide faster performance with low data loads. They used Polymer for fast web components, Shadow DOM, and HTML Imports.
They rely heavily on the cache, but still have an initial load of 1.3 seconds.
They strategically load components to only get critical elements first.
They use Workbox to cache these elements for repeated use.
Now they only request new data, such as transaction information.
They have a 100 score in lighthouse
20% of their PWA bookings come from users that previously uninstalled their Android app
AMP to PWA
<amp-install-serviceworker>: lets an AMP page install a service worker, so the user shifts to the PWA when they click through. This gives the fast initial load of an AMP page and prepares the browser to make the second page load just as fast.
Let’s say someone shares a PWA page with a friend. We’d want the friend to get a fast page load, but if the PWA page is their first experience, they won’t get the acceleration from the service worker. So we can detect at page load whether the service worker is available and, if not, serve the AMP version of that page.
This presentation was created for the CSUN 2017 conference. It introduces several hidden interactions available within Android and iOS. Learn how these work and how to make them accessible. Blue Bell Ice Cream is a classic example still live on the web.
The user must hover over the different images to see what they represent. It uses an image map and lacks alt text.
Android Touch and Hold
A.K.A. Android’s right click. Long press to open context-specific menus.
Default: Touch and hold
With TalkBack: double-tap and hold to long press
This short video shows how you can use the touch and hold/long press to quickly change category or merchant name within the Mint application
onLongClick: called when a view has been clicked and held
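A sketch of wiring up a long-press handler; `itemView` and the `showContextMenu` helper are assumptions for illustration:

```java
// Open a context-specific menu on long press.
itemView.setOnLongClickListener(new View.OnLongClickListener() {
    @Override
    public boolean onLongClick(View v) {
        showContextMenu(v); // assumed helper that shows the menu
        return true;        // true = we consumed the event
    }
});
```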
iOS 3D Touch was introduced on the iPhone 6S. It detects the pressure a person applies to the screen with their finger. A light touch is treated as a tap. A medium touch triggers a peek view. A continued firm touch launches the peek’s content into a full screen.
This also allows a person to trigger a shortcut menu on app icons.
Peek: a quick glance at relevant information
Pop: open the full content previewed in the Peek
Quick Actions: a custom task list from the app icon
User experience: a light press opens a hovering window so you can “Peek” at the content. When you press just a little bit harder, you will “Pop” into the actual content you were previewing in the Peek.
Alternative actions allow users to quickly make changes without having to open a detail screen. For instance, they can delete a transaction or change an email’s status. The standard interface displays the options when a user swipes a row to the left. For VoiceOver users, the options are announced as alternate actions.
ItsDeductible Actions
This short video shows how the alternative actions menu is used in standard mode and how VoiceOver announces the options.
TurboTax uses custom swipe-based navigation between views. It lacks buttons or suggestions to move back and forth. User testing has shown it to be effective for sighted users, but it required some extra work for accessibility.
Default Experience With VoiceOver
The default experience on Turbo Tax uses a custom swipe gesture that lacks navigation buttons.
TurboTax detects a user’s Screen Reader/Switch Control status to show navigation buttons on Android and iOS
This video shows the default and VoiceOver/SwitchControl experience.
Notice in the standard experience how the screen tracks the user’s finger movement. This is not a standard swipe gesture, so it will not work with VoiceOver enabled.
We detect whether VoiceOver or Switch Control is running and display alternate Back and Continue buttons
Animated transition between views
Next and Back flow with every screen
Eliminates navigation buttons
No buttons? Accessibility?
Have I reached the end of the screen?
Instead of a basic swipe gesture, this interface tracks the movement of the finger across the screen, allowing the animation to match the user’s finger speed for a more natural feel. However, the finger touch is intercepted by VoiceOver, so the custom navigation does not work when VoiceOver is enabled.
How can we refactor code to detect any accessibility-related settings and address them?
Helper function to the rescue!
NSNotificationCenter adds observers to track any settings that may require us to show buttons.
This uses OR logic: if VoiceOver OR Switch Control status changed, display the buttons.
A Boolean is assigned true if the buttons need to be shown.
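The Android side of the same pattern can be sketched with AccessibilityManager: check the current state, then observe changes. The `showNavButtons` method is an assumed helper on this screen, and touch-exploration state is only a proxy for TalkBack (Switch Access detection would need additional checks):

```java
// Detect accessibility services and toggle the alternate nav buttons.
AccessibilityManager am =
        (AccessibilityManager) context.getSystemService(Context.ACCESSIBILITY_SERVICE);

// Show buttons if any accessibility service or touch exploration is on now...
showNavButtons(am.isEnabled() || am.isTouchExplorationEnabled());

// ...and keep them in sync as the state changes (API 19+).
am.addTouchExplorationStateChangeListener(
        enabled -> showNavButtons(enabled));
```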