XR Accessibility

[Note: On this site, we use the term Extended Reality (XR) as a convenient shorthand referring collectively to the different modalities of Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR).]

How might XR modalities be deployed for teaching and learning in ways that furnish multiple means for representation, action & expression, and engagement while also aligning with our commitment to making user-facing IT environments accessible? 

This site covers some of the potential barriers to using XR faced by people with disabilities or, better put, by diverse users across a truly representative range of visual, auditory, motor, and cognitive capacities. We also showcase some of the research and emerging best practices for overcoming these barriers, particularly through the support and development of assistive technologies. Along the way, we consider important process dimensions for organizing accessibility analysis: content development, assistive technology support, hardware and content selection/acquisition, instructional deployment with accommodation, and evaluation cycles with iterative improvement. The UDL framework has something to offer across all these dimensions.

Spotlight: Supporting Visually Impaired XR Users

AR for VIP Project Logo

Sensory and Virtual Mapping for Navigation

Background Reading: Microsoft Research's Accessible Mixed Reality

Emerging Solution: UC Berkeley School of Information Augmented Reality for Visually Impaired People HoloLens Project

Available at https://github.com/arvips/AR-for-VIPs-V1

Video Presentation: AR for VIPs Demonstration Video (YouTube)

Further Reading: Dylan Fox's Augmented Reality for Visually Impaired People project site with details and project final report (PDF)

Disability Awareness & XR Best Practices


By the Numbers:  CDC on Common Eye Disorders

Best Practice: Provide spatial audio cues for situating and locating within 3D scenes.

Example: Omnitone is an open-source implementation of ambisonic decoding and binaural rendering built on the Web Audio API.
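The idea behind spatial audio cues can be illustrated with the simplest case: panning a mono source between two ears based on its direction. The Python sketch below uses equal-power panning, a deliberate simplification of the full ambisonic decoding and binaural rendering a library like Omnitone performs; the function name and interface here are illustrative, not part of Omnitone's API.

```python
import math

def equal_power_pan(azimuth_deg):
    """Return (left, right) gains for a source at a given azimuth.

    Equal-power panning: the combined power left**2 + right**2 stays
    constant, so perceived loudness does not change as the source moves.
    Azimuth is in degrees, -90 (hard left) to +90 (hard right).
    This is an illustrative stand-in for true binaural rendering.
    """
    # Map azimuth onto a pan angle between 0 (left) and pi/2 (right).
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)

# A source directly ahead (0 degrees) is heard equally in both ears.
left, right = equal_power_pan(0.0)
```

In a real XR scene these gains would be recomputed as the listener's head rotates, which is exactly the head-tracked rotation matrix update that ambisonic renderers handle for you.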


By the Numbers: NIH on Color Vision Deficiencies

Best Practice: Provide ways for content creators to check color contrast during development.

Example: WCAG Contrast Checker, a free tool for Unity
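Contrast checking rests on a formula defined in WCAG 2.x: each color's relative luminance is computed from linearized sRGB channels, and the contrast ratio is (L1 + 0.05) / (L2 + 0.05), where L1 is the lighter of the two luminances. The Python sketch below shows the computation; the function names are illustrative, not taken from the Unity tool.

```python
def _linearize(c8):
    """Convert an 8-bit sRGB channel (0-255) to linear light per WCAG 2.x."""
    c = c8 / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """WCAG relative luminance of an (r, g, b) tuple of 8-bit values."""
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, from 1:1 up to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background gives the maximum ratio, 21:1.
ratio = contrast_ratio((0, 0, 0), (255, 255, 255))
```

WCAG 2.1 level AA requires at least 4.5:1 for normal text; a check like this can run inside a content-creation pipeline before assets ever reach a headset.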


By the Numbers: NIH Quick Statistics About Hearing

Best Practice: Demonstrate a workflow for converting captions to 3D-compatible file formats and positioning captions within the 3D scene.

Example: Adding Closed Captions to 360 Video with Adobe Captivate
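As a rough illustration of the format-conversion step (a hand-rolled sketch, not the Adobe Captivate workflow), the Python function below converts SubRip (.srt) captions to WebVTT (.vtt), a format whose cue settings support caption positioning. The function name is illustrative.

```python
def srt_to_vtt(srt_text):
    """Convert SubRip (.srt) caption text to minimal WebVTT (.vtt).

    WebVTT differs from SRT in three small ways handled here: it
    requires a 'WEBVTT' header, it uses '.' rather than ',' as the
    decimal separator in timestamps, and it does not need the numeric
    cue indices that SRT places before each timing line.
    """
    out = ["WEBVTT", ""]
    for block in srt_text.strip().split("\n\n"):
        cue = block.splitlines()
        # Drop the leading numeric cue index if present.
        if cue and cue[0].strip().isdigit():
            cue = cue[1:]
        # Fix the decimal separator on the timing line.
        if cue:
            cue[0] = cue[0].replace(",", ".")
        out.extend(cue)
        out.append("")
    return "\n".join(out)

srt = "1\n00:00:01,000 --> 00:00:03,000\nWelcome to the 360 tour.\n"
vtt = srt_to_vtt(srt)
```

Real 360-video pipelines add a placement step on top of this, e.g. WebVTT cue settings or per-cue 3D anchors, so that captions stay readable as the viewer turns.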


By the Numbers: NIH Quick Statistics About Communication Disorders

Best Practice: Allow for Augmentative and Alternative Communication device input to XR systems.

Example: LetMeTalk is a free app that can run on a user's smartphone or tablet while providing vocalized speech audio as input to a primary computer.


By the Numbers: CDC on Difficulties in Physical Functioning

Best Practice: Enable existing VR experiences to be adapted to a user's motion capabilities.

Example: WalkinVR's WalkinDriver, a free SteamVR add-on driver, enables assistive interface software and works with Oculus, HTC Vive, Valve Index, and other headsets.


By the Numbers: ASHA's ID Incidence and Prevalence

Best Practice: Create guidelines and standards for accommodating users with various cognitive limitations.

Example: Portland State University's AASPIRE Web Accessibility Guidelines for Autistic Web Users

This site is maintained by Owen McGrath (Research, Teaching & Learning, UC Berkeley). For questions, suggestions, or comments about this page or website, please send email to assistive-tech@berkeley.edu