Find aria-hidden with this bookmarklet

Use this bookmarklet to find aria-hidden attributes on your page.

I love purpose-built bookmarklets that help you find problematic code. I got an email yesterday from Travis Roth about a potentially vestigial aria-hidden attribute on an otherwise visible element. Unfortunately, it's not uncommon to find aria-hidden="true" on an element that is visible and should have either "false" or no aria-hidden attribute at all. This causes assistive technology to ignore the element.

My first reaction was to search the code for aria-hidden attributes, but this can take time and would have to be repeated on every page to find the issue.

So I created the following bookmarklet that will find any element on your page that uses aria-hidden. It forces the element to be visible and displays the attribute's value.
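The core of the idea is only a few lines of JavaScript. Here is a simplified sketch of the approach (not necessarily the exact code behind the link below): it walks every element carrying aria-hidden, forces it to be visible, and drops the attribute's value in front of it.

javascript:(function () {
  document.querySelectorAll('[aria-hidden]').forEach(function (el) {
    // Force the element to be visible, even if styles hide it
    el.style.setProperty('display', 'block', 'important');
    el.style.setProperty('visibility', 'visible', 'important');
    el.style.setProperty('outline', '2px solid red', 'important');
    // Show the attribute's value right before the element
    var flag = document.createElement('strong');
    flag.textContent = '[aria-hidden="' + el.getAttribute('aria-hidden') + '"]';
    flag.style.cssText = 'background:yellow;color:black;';
    el.insertAdjacentElement('beforebegin', flag);
  });
})();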

Screenshot showing the bookmarklet's effect on hidden elements

To use this bookmarklet, drag the following link to your bookmark toolbar. Visit your questionable page and click the link.

aria-hidden bookmarklet

Web4All Conference Notes – Day 3

Crowdsourcing accessibility evaluations

2013–16: 350 government websites and 2,000 non-government sites have been evaluated for accessibility in China.

Conformance testing included:

  • Automatic Evaluation
  • Manual Assessment

Crowdsourcing can harness the power of crowds to solve the manual-assessment bottleneck.

It was proposed in 2006 and has been used for reCAPTCHA, Spoken Wikipedia, and labeling tasks.

Current crowdsourcing approaches are not well suited to web accessibility because the assessment tasks require a high level of expertise and experience.

Tasks were assigned to participants, and the results were compared on:

  • total work
  • time out
  • give up
  • errors detected.

An algorithm was developed to compare these values and determine a cost model. This lets them use historical data to find which rulesets a person evaluates most efficiently. For instance, a person who is completely blind may be great at checking form labels but not at color contrast.

Assessment of semantic taxonomies for blind indoor navigation based on a shopping center use case

Location-based services (LBS)

  • many LBS are available thanks to smartphones
  • provide turn by turn navigation support using vocal instructions
  • we know little about which environmental elements and features are useful, such as tactile paving or braille buttons

They did a survey of existing taxonomies.

Looking at these data sets, they created a simplified taxonomy based on their similarities

  • pathways
  • doorways
  • elevators
  • venues
  • obstacles (not included in the previous taxonomies)

These elements are defined by their fixed positions within the floor map, and that information is used to generate vocal instructions, such as ones that locate tactile paving:

  • “proceed 9 meters on braille blocks, and turn right”
  • “proceed 20 meters, there are obstacles on both sides”
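As a rough illustration of how an instruction like the ones above might be assembled from a floor-map element (the record format here — distance, tactilePaving, turn, obstacles — is my own assumption, not the paper's):

function instructionFor(segment) {
  // Each piece of the sentence is only added when the floor-map element has it
  var surface = segment.tactilePaving ? ' on braille blocks' : '';
  var turn = segment.turn ? ', and turn ' + segment.turn : '';
  var obstacles = segment.obstacles ? ', there are obstacles ' + segment.obstacles : '';
  return 'proceed ' + segment.distance + ' meters' + surface + turn + obstacles;
}

instructionFor({ distance: 9, tactilePaving: true, turn: 'right' });
// "proceed 9 meters on braille blocks, and turn right"
instructionFor({ distance: 20, obstacles: 'on both sides' });
// "proceed 20 meters, there are obstacles on both sides"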

Announcements of obstacles and tactile paving were confusing and unnecessary for one guide dog user.

Do web users with autism experience barriers when searching for information within web pages?

The study tracked eye gaze to see if there was a difference between two groups: people with and without autism.

Across a series of search tasks, the group with autism had less success completing the tasks than the control group.

The study tracked the gaze path across page elements. For five elements a, b, c, d, and e, a participant's eye map could be a-b-c-e-d.

The variance in these gaze paths was then compared between the two groups.
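The exact measure wasn't covered in the talk. One simple way to quantify how far an observed eye map strays from the expected reading order is an edit distance between the two sequences — purely my own illustration, not the authors' method:

function editDistance(a, b) {
  // Classic Levenshtein distance between two sequences of element labels
  var d = [];
  for (var i = 0; i <= a.length; i++) {
    d[i] = [i];
    for (var j = 1; j <= b.length; j++) {
      d[i][j] = i === 0 ? j : Math.min(
        d[i - 1][j] + 1,                                   // deletion
        d[i][j - 1] + 1,                                   // insertion
        d[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1)  // substitution
      );
    }
  }
  return d[a.length][b.length];
}

editDistance(['a', 'b', 'c', 'd', 'e'], ['a', 'b', 'c', 'e', 'd']); // 2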

DysMusic

The #DysMusic study is creating a language independent test for detecting #dyslexia in children. #w4a2017 @luzrello

Most dyslexia detection tools are still linguistics-based, which isn't appropriate until the child is already 7-12 years old. This study tries to find a detection method that is not language-based, which would allow detection at a much younger age.

There is a memory game with music elements.

Tasks

  • Find the matching sounds
  • distinguish between sounds
  • short time interval perception

Raw sound is modified via frequency, length, rise time, or rhythm; only one property is modified at a time. People with dyslexia tend to have trouble detecting rise-time changes.
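To picture what "only one property is modified at a time" means, here is a small Web Audio sketch — my own illustration, not the study's actual stimuli — that plays two tones identical except for their rise time:

// Needs to run from a user gesture (e.g. a click) so the browser allows audio.
var ctx = new AudioContext();

function playTone(opts) {
  var osc = ctx.createOscillator();
  var gain = ctx.createGain();
  osc.frequency.value = opts.frequency || 440;   // pitch stays the same
  gain.gain.setValueAtTime(0, ctx.currentTime + opts.startAt);
  // Rise time: how quickly the tone ramps up to full volume
  gain.gain.linearRampToValueAtTime(1, ctx.currentTime + opts.startAt + opts.riseTime);
  osc.connect(gain).connect(ctx.destination);
  osc.start(ctx.currentTime + opts.startAt);
  osc.stop(ctx.currentTime + opts.startAt + (opts.duration || 0.5));
}

// Same frequency and length; only the rise time differs between the pair.
playTone({ startAt: 0, riseTime: 0.01 });
playTone({ startAt: 1, riseTime: 0.25 });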

Accessibility Challenge

Producing Accessible Statistics Diagrams in R

Data visualization is increasingly important, and R is a widely used language for statistics. Jonathan (co-author) had been using R to output printed diagrams of statistics. They worked together to convert the R output into an accessible SVG format.

Histograms and box plots (discrete data presentations) were targeted for the initial project. Time series and scatter plots are continuous data graphs.

They extract the important data points, convert them to an XML document, and attach this to the SVG. The final experience provides easy navigation (arrow keys) and supports screen readers via ARIA live regions.
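The talk didn't show the generated markup, but the navigation side could look something like this sketch; the data-label/data-value attributes and the visually-hidden class are assumptions on my part, not the authors' code:

// Live region that screen readers will announce as the user moves around
var live = document.createElement('div');
live.setAttribute('aria-live', 'polite');
live.className = 'visually-hidden';
document.body.appendChild(live);

// Assumes the chart's <svg> has tabindex="0" and each point carries data-* attributes
var svg = document.querySelector('svg');
var points = Array.prototype.slice.call(svg.querySelectorAll('[data-value]'));
var index = 0;

svg.addEventListener('keydown', function (event) {
  if (event.key === 'ArrowRight') index = Math.min(index + 1, points.length - 1);
  else if (event.key === 'ArrowLeft') index = Math.max(index - 1, 0);
  else return;
  var point = points[index];
  // Announce the focused data point, e.g. "March: 42"
  live.textContent = point.getAttribute('data-label') + ': ' + point.getAttribute('data-value');
});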

GazeTheWeb

GazeTheWeb is a simplified browser designed for eye tracking navigation. #w4a2017 #a11y

Math Melodies

Math Melodies makes math easier to learn for children who are blind or have low vision. It presents math exercises as puzzles, with audio icon maps and a variety of exercise types. It was funded via crowdfunding and has been downloaded 1,400 times.

NavCog

NavCog is a navigation project from CMU for blind individuals. It uses Bluetooth Low Energy beacons.

Installation of the beacons is not scalable across large areas. To crowdsource the task, they created a set of instructions that walk volunteers through the process of configuring and installing the beacons.

LuzDeploy

LuzDeploy is a Facebook Messenger bot, which makes it easy to use.

VizLens

VizLens is a crowdsourced interpretation of physical interfaces, such as a microwave oven's controls. Multiple volunteers are recruited to generate labels for the interface. The app then uses augmented reality to virtually overlay the labels.

Chatty Books

Chatty Books is an HTML5 + DAISY reader that creates an audio version of documents. It can now convert from PDF to multimedia DAISY.

  1. PDF – NiftyReader (text)
  2. Export to multimedia DAISY or EPUB 3
  3. Drag and drop into Chatty Books, the DAISY player and library
  4. Upload the DAISY content to the Chatty Books service (cloud) and use the Chatty Books app on an iPad

Able to read my mail

A simplified email program for people with learning and intellectual disabilities. It is a Gmail plugin that converts messages to simplified text or pictograms.

Closed ASL Interpreting for online videos

They created a framework for incorporating an interpreter: closed interpreting, rather than closed captioning.

The interpreter window needs to be flexible, allowing the user to move it around and resize it to reduce distractions. It's closed, so it can be turned on and off.

Moving the eyes back and forth for long periods of time can be exhausting, so the window can be moved closer to the screen's content.

Eye-gaze tracking is used to pause the video when the viewer looks away from it.

Closed Interpreting [CI]

The project provides a video interface that allows closed interpreting, like closed captioning. The interface provides a second screen that includes an ASL interpreter.

The users appreciated the ability to customize the interpreter's location. They also liked the ability to pause the interpreter as their gaze moved from the content to the interpreter.

Difference between :root and html

The :root selector targets the highest-level parent element, which is the <html> element in an HTML document. :root has higher specificity because it is a pseudo-class (specificity of a class, 0,1,0) rather than a plain element selector (0,0,1).

CSS-Tricks has a great description of this: :root by Sara Cope.

In this example, the background of the page would be red, as :root is more specific than html.

:root {background:red;}
html {background:green;}

The :root selector is supported across all major browsers.