Lately, the PEAT team has met many people across industry and academia who are enthusiastically driving a new wave of extended reality (XR) technologies. This community’s commitment to accessible and inclusive XR solutions is essential, and we were excited to join a recent workshop in Seattle, Washington exploring these issues in depth. Hosted by the World Wide Web Consortium (W3C), the workshop’s goal was to discuss strategies for building web-based XR platforms using principles of inclusive design.
Below, please check out the takeaways we gathered for tackling the unique accessibility challenges of XR—and how accessible XR can increase employment opportunities for people with disabilities.
XR Is Made for Accessibility
Speakers throughout the day conveyed an overarching theme: XR naturally lends itself to inclusive design. Josh O’Connor from the W3C noted that XR can provide “rich, accessible alternatives” that do more than simply convey text on a web page. In fact, XR can offer a full accessible experience with multiple modes of interaction based around the user’s needs and preferences.
Creating XR tools often means blending physical and digital environments, which creates opportunities for overlaying accessibility features directly onto the world. Common recommendations emerged throughout the day from several speakers, including:
- the importance of building accessible XR hand or eye controllers for people with varied dexterity
- possibilities for plain language guidance to support cognitive accessibility
Several speakers also discussed exciting research on tangible accessibility solutions:
- Meredith Ringel Morris from Microsoft demoed SeeingVR, a set of tools to make virtual reality more accessible to people with low vision.
- Wendy Dannels from the Rochester Institute of Technology presented her research to deliver auditory accessibility using XR.
- Melina Möhlne from IRT showcased her research on how to display subtitles in 360° media.
Building XR with Meaning
During the event, we discussed the guiding question of how to extend existing web accessibility standards to XR platforms. For example, what factors could we apply from the W3C’s Web Content Accessibility Guidelines (WCAG) and the Web Accessibility Initiative’s Accessible Rich Internet Applications (ARIA) standard?
Taking a step back, this question really concerns how to build XR that conveys meaning to people with and without disabilities. ARIA attributes work for websites because they provide labels for specific features like checkboxes and forms. XR spaces, by contrast, are essentially boundless. Because so many more types of objects require labeling, participants generally doubted that ARIA could transfer directly to XR.
Fortunately, a new file format called glTF could help us assign meaning in XR spaces. Chris Joel from Google presented glTF as a counterpart to JPEG image files. While JPEG is for pictures, glTF is for 3-D objects and scenes. These files can carry labels with rich text information that makes 3-D content more accessible. This text can be legible to screen readers, and it can provide plain language guidance to aid with cognitive accessibility.
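To make this concrete, here is a minimal sketch (written in Python for readability) of how a glTF file might carry such text. A glTF file is JSON at heart, and the glTF 2.0 specification defines a `name` property and a free-form `extras` object on scene objects; the `a11y_description` key used inside `extras` below is a hypothetical example of where plain-language accessibility text could live, not part of the specification itself.

```python
import json

# A minimal glTF 2.0 asset, sketched as a Python dict. The "name" property
# and the free-form "extras" object are defined by the glTF 2.0 spec; the
# "a11y_description" key inside "extras" is a hypothetical example of where
# plain-language accessibility text could be stored.
gltf = {
    "asset": {"version": "2.0"},
    "scene": 0,
    "scenes": [{"nodes": [0]}],
    "nodes": [
        {
            "name": "Office chair",
            "extras": {
                "a11y_description": "A swivel office chair with "
                                    "adjustable armrests, facing the desk."
            },
        }
    ],
}

# Serializing the dict produces the textual form of a .gltf file.
document = json.dumps(gltf, indent=2)

# A screen-reader integration could surface the label and description
# when the user's focus reaches this node.
node = gltf["nodes"][0]
print(f'{node["name"]}: {node["extras"]["a11y_description"]}')
```

Because `extras` is application-defined, a convention like this would only become useful for screen readers once XR browsers and authoring tools agree on shared key names, which is exactly the kind of standardization question the workshop raised.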
However, questions remained about just how much information to present to the user, as opposed to letting them explore the 3-D space on their own. Participants wondered how to make producing this information easier for the developers creating glTF files, and of course, what other solutions might exist to build XR with meaning.
How Can We Use XR at Work?
XR includes virtual, augmented, immersive, and mixed reality tools, and its potential applications are vast. In the workplace, these XR tools can bolster functions like virtual meetings and online training. XR can also provide a platform for workers to engage more directly with 3-D models in domains like architecture, medicine, engineering, and manufacturing.
PEAT looks forward to continued involvement in making XR more accessible through our partnership with the XR Access Initiative. For more on the topic of accessible XR technologies, check out our key takeaways from the 2019 MAVRIC Conference on Achieving Measurable Results with XR.