XR Accessibility – Learning from the past and addressing real user needs for inclusive immersive environments


XR is an umbrella term for the spectrum of hardware, applications, and techniques used for virtual reality and immersive environments, augmented or mixed reality, and other related technologies. Building inclusive XR experiences presents many challenges. While awareness of accessibility in 2D space is improving over time, little is understood about the needs of people with disabilities in XR. These challenges include assistive technology support and interoperability, complex input devices, and control schemes that demand a high degree of precision, timing, and simultaneous action.

Designers and developers working in XR need to understand and support various input and output modalities, such as text-to-speech and switch or gestural interfaces. The W3C is exploring these challenges and raising awareness of diverse user needs. The Research Questions Task Force (RQTF) has published XR Accessibility User Requirements, a cutting-edge Working Group Note that lists user needs and requirements for people with disabilities when using XR.

Other related work in the RQTF has explored XR accessibility architecture, including the potential of object-oriented semantics, the application of semantic scene graphs, and the question of how XR authoring tools can support accessibility.
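To make the semantic scene graph idea concrete, the sketch below shows a scene-graph node annotated with accessibility semantics and a traversal that flattens the annotated nodes into a list an assistive technology could consume. All names and shapes here are illustrative assumptions, not part of any W3C specification.

```typescript
// Hypothetical accessibility semantics attached to a scene-graph node.
// Roles and fields are illustrative only.
interface A11ySemantics {
  role: string;          // e.g. "button", "door", "landmark"
  label: string;         // short name a screen reader could announce
  description?: string;  // optional longer description
}

interface SceneNode {
  id: string;
  semantics?: A11ySemantics; // purely decorative nodes omit this
  children: SceneNode[];
}

// Depth-first walk collecting only the nodes that carry semantics.
function collectSemantics(node: SceneNode): A11ySemantics[] {
  const result: A11ySemantics[] = [];
  if (node.semantics) result.push(node.semantics);
  for (const child of node.children) {
    result.push(...collectSemantics(child));
  }
  return result;
}

// Example scene: a room landmark containing decorative geometry
// and a labelled exit door.
const room: SceneNode = {
  id: "room-1",
  semantics: { role: "landmark", label: "Lobby" },
  children: [
    { id: "mesh-42", children: [] }, // visual-only, no semantics
    {
      id: "door-1",
      semantics: { role: "door", label: "Exit", description: "Leads outside" },
      children: [],
    },
  ],
};

console.log(collectSemantics(room).map((s) => s.label)); // → ["Lobby", "Exit"]
```

The design choice illustrated is that semantics live alongside, rather than inside, the rendering data, so the same traversal can feed a screen reader, a sonification layer, or a navigation aid without touching the visual pipeline.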

Current initiatives are addressing XR accessibility through a series of workshops. The W3C recently ran an 'Inclusive Design for Immersive Web standards' workshop in Seattle last year, building on previous workshops on Virtual Reality (VR) in 2016 and VR Authoring in 2017. The XR Access initiative is also holding a series of accessibility-related workshops and symposia that aim to bring together researchers, technologists, and advocates to focus attention on this space.

The goals of the recent Inclusive Design for Immersive Web standards workshop were to:

  • Share existing inclusive XR solutions to help create new standards for inclusive XR on the web.
  • Identify accessibility gaps in existing web XR technology and consider solutions for closing those gaps.
  • Explore ways to use existing technologies and standards to create innovative solutions for inclusive XR on the web.

Some of the breakout sessions explored:

  • Accessibility hooks for graphical 3D environments.
  • Making motricity accessible in XR.
  • Auditory accessibility in 3D environments.
  • Assistive technologies for XR.

For an overview of the findings, you can read the recent report from the Inclusive Design for Immersive Web standards workshop.

We aim to build on these efforts at this ICCHP thematic session and are interested in paper submissions if you are working in this or a related space.

Topics of interest include:

  • Games accessibility
  • Geospatial data and mapping
  • Spatial computing accessibility
  • Sensor enabled navigation
  • Way-finding services
  • Semantic scene graphs
  • Object-oriented semantics in XR
  • Personalisation

Some general questions to consider are:

  • What are the needs of people with disabilities in XR?
  • How can we practically build XR experiences that meet these user needs?
  • Can we learn from previous accessibility initiatives in this space?
  • How can we best leverage the current technology landscape to build inclusive futures?

This ICCHP Special Thematic Session (STS) therefore invites papers on XR accessibility, with particular emphasis on the topics mentioned above. Papers on other related topics are welcome too.


Contributions to an STS must be submitted using the standard submission procedures of ICCHP. When submitting your contribution, please make sure to select the right STS under "Special Thematic Session". Contributions to an STS are evaluated by the Programme Committee of ICCHP and by the chair(s) of the STS. Please get in contact with the STS chairs to discuss your involvement and pre-evaluation of your contribution.

Submission deadline for contributions to STSs: April 15, 2020.