Emerging technologies, including artificial intelligence (AI) and virtual and augmented reality (XR), will shape the workplace of the future. They will change how organizations hire, train, and support employees, including those with disabilities. In this webinar, PEAT’s Co-director, Bill Curtis-Davidson, interviews accessible technology experts to explore how employers and HR professionals can help plan for an accessible workplace of the future, including the procurement of accessible technologies and the potential accommodation needs of people with disabilities.

This presentation originally aired on May 20, 2021.


Anne E. Hirsh
Associate Director
Job Accommodation Network

Bill Curtis-Davidson
Co-Director
Partnership on Employment & Accessible Technology

Martez Mott, Ph.D.
Senior Researcher
Ability Group, Microsoft Research

Tina Park, Ph.D.
Methods for Inclusion Research Fellow
Partnership on AI


Download Transcript (.docx)
Download Presentation Slides (.pptx)



Hello, everyone, and thank you for joining us for this special webinar, “Accessibility & Accommodations: How Employers and HR Professionals Can Prepare for Emerging Tech in the Workplace.”

This session is presented by JAN and the Partnership on Employment and Accessible Technology, or PEAT. Both JAN and PEAT are funded by contracts with the Office of Disability Employment Policy, or ODEP, at the U.S. Department of Labor. We have three great speakers for you today: Bill Curtis-Davidson, co-director of PEAT; Dr. Martez Mott, senior researcher at Microsoft Research; and Dr. Tina Park, Methods for Inclusion research fellow at the Partnership on AI.

Before we begin, we have just a few housekeeping items to go over. First, if you experience technical difficulties during the training, please use the question and answer option at the bottom of your screen or use the live chat option at AskJAN.org.

Second, questions may be submitted during the training by using the question and answer option located at the bottom of your screen. Questions will be gathered into a queue, so don’t worry if your question appears to have been deleted or dismissed.

To access the PowerPoint slides, use the link included in the chat or download them from the webcast archive on AskJAN.org.

To access captioning, use the closed captioning option at the bottom of the webcast window. If preferred, you can view captions through your browser using the StreamText website; the link is available in your login email and will also be provided via the chat window. A copy of the transcript will be provided with the archived webcast.

This training is being recorded and will be available on the AskJAN.org website as well as PEAT’s website. And finally, at the end of the training, an evaluation will automatically pop up on your screen in another window. We really appreciate your feedback, so please stay logged on to complete the evaluation.

Now let’s get started with our training.

As we welcome you, we want to say happy Global Accessibility Awareness Day, or GAAD. This webinar is actually happening on the 10th anniversary of GAAD. The purpose of GAAD is to get everyone talking, thinking, and learning about digital access and inclusion. Of course, today our focus will be on employment and accommodations. I know many of you are familiar with JAN, but just in case, the crux of our service is to provide free, confidential, one-on-one consultation on all things related to workplace accommodations. We work with all industries, job types, and disabilities or medical conditions, giving you targeted technical assistance not only on accommodation options but also on what the ADA or Rehabilitation Act might say about the situation described. Our work covers the entire life cycle of employment: the application stage, interviewing, onboarding, return to work, stay at work, and yes, promotion within your organization.

As technologies change and new ones are developed, incorporating and understanding accessibility and accommodations are key to successful employment and maintaining productivity of all workers. So I am excited to learn more on this topic with you today.

Now let me turn it over to Bill Curtis-Davidson at PEAT.  Welcome, Bill.


Thank you, Anne. PEAT is delighted to partner with JAN on this important webinar in celebration of GAAD.

I’ll tell you a little bit about PEAT’s mission first. PEAT is working to build the future of work, and we’re doing so by fostering collaborations in the technology space that build inclusive workplaces for people with disabilities. Our vision is a future where new and emerging technologies, such as those you’ll hear about today, are accessible to the workforce by design. You can learn more about PEAT by going to our website at PEATworks.org.

First, on the next slide, I would like to talk briefly about the connection between inclusive tech and employment. Research shows that 33.2% of adults with disabilities are employed, compared to 76.6% of those without disabilities. This is a staggering gap in workforce participation that inclusive technologies can help close. In 2020, the move to virtual work accelerated rapidly.

In February of this year, the McKinsey Global Institute released a report titled “The Future of Work after COVID-19.” In that report they found that the adoption of digital technology sped up by several years in just eight months, and many of those changes are likely permanent. The tools behind the virtual workplace are increasingly powered by new and emerging technologies like AI and extended reality.

On the next slide I’ll talk a little bit about what extended reality is. Extended reality is an umbrella term comprising new kinds of spatial technologies and user interfaces. You can think of it as including three main kinds of technology experiences.

First, virtual reality, which consists of fully digitally generated immersive experiences accessed via head-mounted displays or via web browsers. Here I’m showing an image of a virtual reality meeting where avatars of real people are conversing and presenting in a digital room, sharing a presentation on a virtual screen.

Second, augmented reality, or AR, which consists of digitally augmented experiences rendered as overlays on the user’s real-world field of view. Usually these are accessed via head-mounted displays, but they can also be accessed via a mobile app. Here I’m showing a man wearing an augmented reality device; he’s getting remote assistance, training, and instructions while working on equipment.

Finally, mixed reality is where the physical and digital worlds are merged to produce experiences where physical and digital objects and humans co-exist in three-dimensional space. And here I’m showing a person working in a physical office who is conversing with a colleague represented as a hologram or a 3D video. And they are working together on shared documents that are digitally placed in the environment that they both share.

So on the next slide we’ll talk a little bit about what accessibility means in XR. We can look at accessibility in two different ways. The first part of the slide covers the core hardware and software that comprises these technologies, such as the operating systems: making sure that people can access the baseline systems. Then, if you build the slide, the next part covers the content that’s built on these devices.

If you could please advance the slide.

As well as new kinds of assistive technologies that can be built using the capabilities of these powerful platforms. And then finally, if you advance the slide one more time, we’ll see that in industry collaborations we’re encouraging inclusive co-design involving people with disabilities, so that we can build these technologies to be more usable by everyone and advance their overall possibilities.

On the next slide, we’ll talk about AI, our second topic area for today, and how prevalent it is in the workplace; you’ll hear about this in a moment when we speak to Dr. Tina Park. AI is already used in current and future workplaces for job searching and finding work; recruiting, screening, interviewing, and hiring talent; onboarding, training, and development; physical site navigation; communication, collaboration, and productivity; new kinds of personalized accommodations and assistive technologies; health, wellness, and safety; optimizing work, performance, and promotions; and finally, digital assistants. Some of you may have used a chatbot or some other kind of assistant that is powered by AI.

And when we think about AI and accessibility on the next slide, leading researchers such as Meredith Ringel Morris have been exploring AI fairness for people with disabilities. In an important research paper, Dr. Morris outlined seven key challenges: inclusive design; bias prevention; data privacy; errors and trust; setting proper expectations of where these technologies work well and where they don’t yet; simulated versus inclusive datasets, that is, using real data about people with disabilities to build better systems; and last but not least, social acceptability: what is acceptable use of AI given the constraints we have today?

And with that, I’m going to introduce our first speaker, Dr. Martez Mott, senior researcher at Microsoft. And we’re going to hear some perspective from him on XR.

[Interview with Dr. Martez Mott]

So Martez, I’m so glad you’re here with us today to share your perspective on XR in the workplace.  Can you start out by just introducing yourself?


Yeah, of course, and thank you so much, Bill, for having me. It’s really great to be here with everyone else. So my name’s Martez Mott. I’m a senior researcher in the Ability Group at Microsoft Research in Redmond. I primarily study human-computer interaction and accessibility, and my focus at Microsoft Research has been investigating how to improve the accessibility of different types of mixed reality systems.


Oh, that’s great. I’m sure your job is really interesting, working with the team there and with all of your colleagues in the industry. So maybe as we start out, what we could look at is: how have you worked with people with disabilities in your research? Give us some sense of what you’ve been learning as you’ve worked with people with disabilities in this new area of technology, and how would you describe some challenges that people with disabilities might experience when they’re using XR?


Yeah, of course. I primarily work with people with physical disabilities, so this might be people with conditions such as cerebral palsy, muscular dystrophy, or Parkinson’s, or people who have had strokes or amputations, things like that. What we have been doing in our research over the past two years or so is trying to identify what barriers people might encounter when they’re using different types of XR systems. A lot of this is very human-centered research: really talking to people, trying to understand their perspectives, and getting insights from them about what they actually want to see as these systems are being designed and improved upon.

So in our research so far, we’ve really been able to identify and think about what these barriers look like. For example, people with physical disabilities might have difficulty just accessing the hardware that’s common in different XR systems. An example could be the head-mounted display, or HMD, that’s common for augmented reality systems like Microsoft’s HoloLens or virtual reality systems like the Oculus Quest. These systems require people to have a certain amount of movement and dexterity in their heads, necks, arms, hands, and fingers, and that can be really difficult. You can imagine saying, “Hey, you need to wear this HMD to do some type of task at work,” but if a person has difficulty, let’s say, lifting their arms above their shoulders, then just putting on and taking off the HMD could in itself be an accessibility barrier. Or you might have specialized controllers that give people really great tracking capabilities, but if you expect people to use two hands in a very fluid way to interact with different types of virtual content, whether it be a hologram in the environment or some kind of content in a virtual environment, that can be really difficult for people. So it’s really trying to understand what these barriers are and how we can overcome them.

And while I’ve primarily looked at people with physical disabilities, colleagues and other researchers have found other barriers for people with other disabilities. For example, if a person is blind or has low vision, it can be really difficult to get access to some of this virtual content. If you have a hologram that’s placed somewhere inside the physical environment, how can you give people who are blind or have low vision access, so they know what that hologram is and how it’s interacting with other things inside the environment?


Yeah, and when you bring up holograms, I think of objects but also 3D video of people, right, other people themselves. So it’s not just digital objects that are inanimate, if you will, or models. It could be a representation of a colleague that you’re meeting with, right? In hologram form.


Yeah, exactly. I think there is this expectation that as this technology improves, we’re going to go from these more static representations, which might be, “Oh, here’s some virtual content I’ve just placed in the environment for people to see,” to, like you mentioned, more interactive ones: “Oh, there’s a colleague who’s located in another country, but I’m seeing a representation of them inside my physical space, or similarly they’re seeing a representation of me in their physical space, and we’re trying to interact on shared virtual or digital content.” So it could be something like, “Hey, let’s look at this model that I just built, and we’re showcasing it inside the physical space. How can people who are present and people who aren’t both have access to that shared content?”

And then, like you said, with 360 video and these other different types of new content that exist, there could be these accessibility challenges or barriers that could pop up there.  And I think a lot of people are thinking about ways to improve the accessibility of all these different forms and representation that could come about.


Yeah, and I think, for our audience, some of this stuff may seem Space Age, like it’s far off, but it’s really not. It’s already in products such as the Microsoft HoloLens, and most of the larger platforms are looking at this area of immersive meetings and those types of things. The communication this will enable is really fascinating to think about, and it’s also a huge challenge area for making sure that people with disabilities are not excluded from these immersive hybrid experiences.

So maybe that’s a good segue into the next question I had for you. If we look at the workplace, and I just mentioned immersive meetings, I know that especially in post-COVID hybrid workplaces, companies are looking for ways to get people connected. What are some of the big use cases you see that are going to maybe get the first traction, if you will, and where might we start seeing these technologies happening?


Yeah, of course. I think you hit on the big one, this kind of hybrid meeting. And it’s not just in the workplace; you’re seeing it in education, with schools and students being remote and different situations like that. So if we take hybrid meetings as one example, you might think of it as saying, “Okay, we want people to be able to meet with one another and share both virtual and physical spaces. But how do you do that in an accessible way?” Right?

So you have to think about the abilities of your employees, the situations they are in, and whether the technologies you are providing them allow them to have equal footing. You don’t want these technologies to create barriers, or second-class citizens, maybe. Right? You don’t want a situation where you say, “Hey, we want people to have these opportunities,” but because the technologies are inaccessible, people get relegated to less exciting technology, so they can only interact with the group through, say, a chat interface instead of interfaces that might be a little more immersive and could actually add a little more flavor to the meeting. Things like that.

Another example could be remote assistance. You can imagine that these devices we’re using can extend capabilities in a lot of different ways. They give us the opportunity, for example, to have cameras and microphones, and that will allow people to say, “Hey, I’m providing some type of service, or I’m doing some type of task, and I need another person to give me some recommendations on what to do next, or how I should proceed given where I am.”

And you want these technologies to be accessible to people. Because if they are not, you can imagine that for whatever job classification relies on these technologies, people with disabilities won’t be able to get into those jobs and won’t have those same types of opportunities.

So I think there’s going to be a lot of range of different types of application and use cases that, if they’re not accessible, we might be inadvertently keeping people away from these terrific job opportunities.


Right. And I think you just made another element of the case for why we so desperately need the work that you and others are contributing to, which is to look at the needs of people with disabilities and also design better technologies. When we design for people with disabilities, for example someone who is deaf or hard of hearing, we might design better tech for an immersive meeting where everybody is quieter, or for a loud environment where people are still trying to meet but really can’t rely on audio.

And I applaud the work of Microsoft and other companies who really have taken inclusive design to heart.  And so that’s one thing that we always remind our audience of certainly with PEAT.

I know your work has also touched on unique and novel forms of assistive technology. A lot of our audience would be familiar with workplace accommodations like a screen reader, or other kinds of assistive tech that are commonly used. So can you tell us briefly about some examples of assistive tech that XR is enabling?


Yeah. So, like I briefly mentioned, XR really gives us the opportunity to augment human abilities in a lot of interesting ways. To your point, for people who are deaf or hard of hearing, you could imagine wearable devices that provide, let’s say, visual cues for audio information. Think of broadcast information: a fire alarm is going off, or something is being announced on a PA system. If that information would have been inaccessible to a person before, now, if they have some type of XR device, it could provide them visual cues to let them know, “Oh, a broadcast is happening,” and can provide captions, for example, of what that broadcast is.

Similarly, for people who might have low vision, you can use the cameras on these XR systems to provide object recognition. So you can tell people, “Hey, we can use the cameras to identify certain objects,” and then use the heads-up display in the headset to provide them visual cues, or audio cues, to say, “The object you’re looking for is located here, and you need to move in this direction.”


That’s great. I think we could talk for a very long time about any one of the questions I’ve asked you, but I know our time is a little short today. So maybe just some last comments: how do people get involved in this? What would you recommend?


Yeah, to your point earlier: talking to people, talking to employees, actually engaging people with disabilities in the decision-making process and in the design process as well. If people are thinking about changing their workflows or how they conduct business and moving more toward XR technologies, I think it’s important to consult with their employees to really learn from them and say, “Hey, this is what we’re thinking; how might this change impact you?” How can we improve the process so that people have an equal opportunity to have a voice, or a say, in what their workplace accommodations are going to be?

In terms of engaging with the broader community, connecting with XR Access and other communities and grassroots organizations can be a really useful way to learn more about what these technologies can do and also to become more aware of the potential pitfalls. You want to know beforehand what potential harms could occur, and if you know those beforehand, you can go through the process of mitigating those harms so your employees can have the best experiences possible.


Well, thank you for that. Those are really wise words. I have really enjoyed having the chance to speak with you today, and again, we could go on and on, so there will be other occasions, I’m sure. But I’m really thankful to have had this time with you, Martez, and I really appreciate all of the work that you and your colleagues do to help improve XR accessibility.


Thank you, Bill, for having me.  This is great.  And I’m really looking forward to learning more from other people in the panel, as well.  So thank you.


Okay.  That was our first featured video. And next we’re going to hear from our second guest speaker, and that’s Dr. Tina Park, who is with the Partnership on AI.

[Interview with Dr. Tina Park]

Thank you so much for joining us today, Tina. I’m really delighted to have this chance to speak with you. Let’s start out by having you introduce yourself.


Fantastic. I’m so happy to be here, and thank you so much for the invitation.

My name’s Tina Park. I am a research fellow at the Partnership on AI, which is a multi-stakeholder-based nonprofit. We try to address issues of fairness, inclusivity, accessibility, and other ethical concerns that are related to the development of AI and machine learning technology. My project specifically looks at how these products and services can be developed with the involvement and input of a more diverse and inclusive set of people and their experiences.


Great, thank you for that.  It’s a fascinating role, and I’m sure you’re enjoying that.  It sounds great.

As we start out, I’ll note that, as mentioned earlier, AI is developing really rapidly, is already very prevalent, and will become more so in the workplace. So can you give us some easy-to-understand examples of what we mean when we say “AI in the workplace”?


Absolutely. One that’s been around for a while, and that we probably interact with regularly, is OCR, or optical character recognition. It converts content in, say, a PDF into searchable and editable text; we probably use it all the time. Other common tools that are becoming popular are the speech-to-text tools built into virtual meeting platforms like the one we’re on, which process human speech and autogenerate captions and transcripts. We’ve all also probably interacted with a chatbot at one time or another, especially for customer service purposes, but we’re also seeing those come into the workplace to help automate Q&As that employees might have around certain processes, or even looking up people in the directory; it just eases those kinds of interactions. And then, obviously, HR professionals are familiar with the recruiting and candidate screening tools that help us process applications much more quickly.


Yes, those are great examples, thank you. I think it helps us understand that AI is in almost everything right now, and will become even more prevalent.

And as we look at that, we want to make sure our workplaces are as inclusive as possible for all types of people, regardless of race, gender identity, background, ability, or age. What are some of the challenges that might arise when we’re thinking about AI workplace tools, for example, for people with disabilities?


I think, first and foremost, it’s important to note that all of these AI-enabled tools are still evolving and improving. It’s also important to remember that they are not necessarily built with people with disabilities in mind. Part of my project is trying to change that by developing better processes for developers to think about the needs of the diverse communities that will be using those tools. But right now we’re still seeing challenges in how people with disabilities can really access and use those tools, so we need to make sure that we take additional steps before we adopt any of these things into our own workplaces. I think we really need to begin by asking what actual challenges employees are facing in the first place.

For example, not all disabilities are visible, so we need to be mindful of the full range of accessibility issues that people might be dealing with. Physical differences, such as wheelchair use or visual or hearing impairments, may require different solutions than neurodivergence, meaning things like mental health, learning, or cognitive development differences. What’s great about technology, and especially AI-enabled technology, is that it allows employers to create more individualized strategies for employee support without having to make massive resource investments.

And so, while it’s important to recognize that technology can’t address every challenge for every person, we need to focus on aligning those challenges and solutions as best we can using the tools we have on hand. If you think about things like auto-captioning or transcription tools in the workplace, they seem like a very obvious and easy way to support people with hearing difficulties. But as it stands right now, most speech-to-text tools have a lot of difficulty with non-English languages, with speech patterns and accents that don’t conform to very specific American English dialects, and with discussions that include a lot of industry-specific or complex jargon.

AI tools are only as good as the datasets they are trained on, and those datasets are currently really limited to data taken from American or Anglo, able-bodied, cis, heterosexual White men. This bias in AI development datasets is really evident in applications like those applicant recruiting and screening tools: what characteristics are being prioritized in terms of what we consider “the ideal candidate”? Are those characteristics coded in such a way that they actually exclude women, racial and ethnic minorities, people with disabilities, or even people from less elite institutions or work histories, for example?

So I would say that any new technology that’s adopted really needs to be adopted with the active consultation and input of those who are ultimately going to be impacted by its use. If you’re trying to address potential barriers in the workplace, talk to the employees who are most likely to face those barriers. You can start by creating and supporting internal employee resource groups, or ERGs, empowering them to identify issues within the workplace, and working with them to adopt and test new technology that actually fits and addresses those issues. And if you’re thinking about adopting technology to improve everybody’s effectiveness and productivity, pilot it with diverse sets of employees, especially those of different genders, racial and ethnic identities, and ages, and people with physical, sensory, or cognitive differences, to make sure they aren’t being negatively impacted by the adoption of that technology.

Those are great recommendations, and I think about things like inclusive design. We can also design the implementations of what we’re buying, making sure we’re involving the constituencies that will use them and making sure they are diverse.

So, from what you’re saying, you can talk to your suppliers and vendors, ask them these questions, and then run pilots, right? There are some things that can be done. We see responsible tech movements and ethical AI, and there are a lot of resources. So what are some things that companies can actually do to address what you’re talking about?


Yeah, it’s really important to talk to your vendors. Ask lots of questions, including: How was it designed? What were their imagined use cases? What did their overall development plan look like? Was the technology developed by diverse teams? Were those teams involved in the design and testing of it? And again, when you talk to your own employees, they can identify issues that help you formulate really good questions to put to your vendors, so you can think things through before you actually adopt and implement these tools in your workplace. You can also work with vendors to see if there are different configurations you can try that better adapt to the needs of your own organization. And definitely talk to HR, legal, and compliance to make sure that you’re following the laws within your own region and country and that your employees are being supported.

And ultimately, these workplace AI technologies should really be scrutinized and approached with a fair amount of caution. They seem like a quick fix to some of the major issues your organization might be facing, but implementing a technology that is not quite ready to support the people in your specific group may generate more complicated issues to navigate down the line. In some cases a lower-risk or simpler technology might be the better solution, so it’s worth considering all of those options.


I think that’s great, and what I think of is levels of risk, right? If I’m implementing a well-established app that serves as an accommodation tool, like Seeing AI, that’s a pretty narrow thing that’s tried and true; it’s established, it’s been tested. Whereas something like a hiring platform has multiple components to it. And so that’s what I like about what you’re saying. I know we could go on and on about a lot of this, and we will continue to explore these issues. But I want to say thank you, and lastly just ask you: how can people stay in touch with what you’re doing at the Partnership on AI?


Definitely keep updated via our website, partnershiponAI.org, including my project, the Methods for Inclusion project, and also work from my colleagues. Dr. Jeff Brown, for example, is running our diversity, equity, and inclusion attrition project. So if you’re concerned about why people of diverse backgrounds are leaving your organization, that will be a really great one to keep track of.


Great. Well, thank you so much for joining us today and for all of your excellent commentary and perspective. We really appreciate it, and we look forward to continuing the discussion with you. Tina, thank you.


Thank you so much.

[Audience Q&A]


Okay. I want to say thank you to both Martez and Tina for those excellent discussions; I really enjoyed interviewing both of you. And now we’re going to have time for some audience questions, so we’ll take a look and see. Devin, I think you’ll be helping us with our first question?


I will.  Thanks for the introduction, Bill. Hi, everyone, I’m Devin Boyle. I’m with the Partnership on Employment and Accessible Technology with Bill.

I just want to flag that we did get a request to speak a little slower. So I’m going to try my best to do that.  And thank you for that flag.

So the first question we have is from the audience, and I’m going to direct this one at you, Martez. Thank you, Natalie, for this question; it’s a great one. It seems some people are getting tech-weary, Zoom-weary, and we really need to know how to stay connected. So Martez, how do you think we can use XR as a way to stay more connected to our colleagues?


Yes, thank you, that is a great question. So I think the opportunity we’re starting to see now, especially since remote work has become more commonplace over the last year and a half, is that companies and organizations are really starting to think about how they can give people more flexibility in where they work while still allowing their employees to remain connected, like you mentioned. I think XR is actually uniquely positioned to help facilitate that type of interaction. For example, in some of the images that Bill showed earlier in the presentation, you can imagine hybrid workplaces where some people are in the office and some people are remote, and they can still have a presence with others even though they may not be there. So you can use different types of hologram technology, or different types of virtual representations through avatars, to allow people to still have a sense of presence even though they may be located elsewhere.

And I think there’s still a lot of research and work that needs to be done on how to ensure that those applications and scenarios are made accessible to people with varying abilities, because you don’t want situations where people feel left out, or feel they don’t have the option to work remotely, because these technologies aren’t built for them or aren’t designed well for whatever abilities they have.

So I think there’s an opportunity to allow that sense of co-presence between people who are in person and people who are remote, but we really need to get the details right in order to make sure we can have an inclusive and equitable experience for everyone.


Thank you, Martez. I appreciate that.

I’m going to jump in. We had some questions come in before the webinar, as well, and I wanted to send one over to Anne, if you don’t mind, Anne. So how do you envision requests for accommodations changing as emerging technologies enter the workplace?


Well, I can see them changing in a number of ways, especially with the potential use of AI. We are hearing from some employers that they are setting things up so that, say, an accommodation that costs $500 or less can be automatically approved, and maybe they will have some type of chatbot that the applicant or employee would use to get more information about that type of accommodation. And to what Martez was just speaking to, beyond just the request for the accommodation, I can see how the technologies Martez was discussing might be involved in the Interactive Process: if it was helpful, you could have somebody on the work site and somebody with them virtually to look at something specific, and still have that presence, as if they were meeting together, to facilitate the Interactive Process. And that’s just a start. I think there’s a lot of potential here.


Thank you, Anne, that was a really good point.

I want to direct a question over to Tina. Tina, how can organizations involve and actively consult with people with disabilities as well as other diversity groups as they consider adoption and implementation of AI-enabled workplace technologies?


Thanks so much for the question.

I think, as Martez and I have both echoed, centering the person the technology is supposed to serve is always most important. So those ERGs are really key; they are your first line of folks who can provide the input you need to make those important decisions. If you don’t have ERGs, starting them is a really great place to begin. I know from my colleague Dr. Brown’s research that when ERGs are well supported by management and have resources, that’s when they are most effective. So if you are in a role that can support those spaces, definitely do so.

I also think thinking intersectionally about how those spaces work is really key. Other axes of social identity, like gender identity and race, can impact things differently, so make sure those ERGs are diverse; for example, make sure the group that serves your Black employees is inclusive of folks with disabilities, and vice versa.

I think working with disability-led consulting groups and organizations to help you identify resources, trainings, and workshops is also a great way to inform yourself and the rest of the organization on how to make those kinds of changes. And then work with groups like the American Association of People with Disabilities. They have launched a Start Access Initiative, which is really looking at how AI technology and other technologies can be born accessible, so keep up to date with the recommendations coming out of groups like that. Then check within your own domain-specific professional organizations; see if they have advocacy and special interest groups dedicated to people with disabilities, because they might be able to speak more specifically to the needs within your own industry and provide those kinds of recommendations as well. Basically, reach out, find the people who have this expertise, and get their input as regularly and often as you can.


Thank you, Tina.

Just quickly, somebody asked in the chat what ERG stands for. It’s Employee Resource Group. Did I get that right, guys?


Thank you, yes.


Thank you.

Okay. We have a quick question; I’m going to throw this one back to Martez. Can screenreader software, JAWS for example, read holograms? So I think the question is asking more broadly: are screenreaders developed, or being developed, that can read holograms?


Yes, that’s a great question.

So I think that’s a current gap in our capabilities at the moment. There’s a lot of ongoing research that I know of that’s trying to address this question: how can we provide better screenreader support for these virtual environments? Whether they be holograms or other types of virtual materials that you might place either in physical space through a hologram or, if we’re collocated in a virtual space, in that environment, how can we provide the correct amount of information? There are a lot of questions, for example, about what metadata screenreaders need access to in order to let people get this information. What do you do when there are multiple objects, or dynamic objects that can change or update over the course of an interaction? How do you provide that information in real time and at the correct level of fidelity, so people don’t feel overwhelmed, or feel they don’t have enough information to make an accurate judgment about what’s being said or done?

So it’s a great question, and it’s something the research community has identified as a huge problem. My hope is that, especially in the coming months and years, we will have some solutions to address it, and hopefully really good solutions before these technologies become more mainstream and start to emerge in more workplace, social, and educational environments.


Thank you, Martez.

I’m going to throw one more question at you, Tina: who should have control over AI-enabled technologies, the employer or the disabled employee? Employers tend to control the autonomy of the employee in most cases.


Yes, and thank you to the attendee who posed this question. I think it’s such an important one to consider, because while we’re all here talking about the potential for AI technology to really assist the worker and the employee, this question brings up a great point: oftentimes the introduction of technology is about trying to monitor the productivity of workers as well.

So we have to think about what purpose the technology is serving and to whose benefit. We believe that control needs to be shared by both. I think that’s why we’re emphasizing the involvement and inclusion of employees with disabilities, and those who are affected by the technology, throughout the decision-making, selection, and adoption processes. They should be there to audit; they should be there to determine what changes need to be made. This can’t just be a one-time point of interaction where you speak to employees to get their input and leave it at that.

This is why having organizational structures in place where employees can effectively advocate for themselves, and communicate upward and outward to the rest of the organization what their needs are and how they’re not being met, is really important. Establishing that kind of trust and relationship is necessary to make sure these technologies do what they are intended to do, which is to support employees in the workplace.


Thank you so much, Tina.

So we’re out of time for Q&A, but we will respond to the questions that came through. Thank you, everyone, for asking these great questions.

So I’ll throw it back over to you, Bill.


Great. I want to thank you, Devin, for moderating our Q&A, and also our guest speakers, Martez and Tina, for the excellent points you made here. We could go on and speak about this for hours, I’m sure.

I’m just going to pause and say please check out PEAT’s resources. We have our website at PEATworks.org/futureofwork and some podcasts, including on AI and XR. Some of the community work we’re doing includes the community mentioned in Martez’s interview, XRaccess.org, and then of course check out PartnershipOnAI.org. You can email us at info@PEATworks.org. I would also like to invite each of you to the next event we’re planning in the community: the 2021 XR Access Symposium, which is free and all-virtual, on June 10th from 10 a.m. to 3:30 p.m. Eastern. You can register at XRaccess.org/symposium, and we invite you to attend. Also, this summer PEAT will be releasing an equitable AI toolkit targeted toward employers, and some of the topics we discussed today will be covered in that resource.

I’m going to turn it back over to Anne to close out our webinar.


Thank you, Bill. Those sound like excellent resources, and we really look forward to seeing them. We want to extend a special thank you to all of our speakers for sharing your experiences with us today. Thank you also to everyone for attending this JAN and PEAT training event.

If you need additional information about the topics shared today, please contact JAN via AskJAN.org, or contact PEAT using the information Bill just provided.

As mentioned earlier, an evaluation form will automatically pop up on your screen in another window. We appreciate your feedback, so we hope you’ll take a minute to complete the form.

And finally, please join us again on June 8th at 2 p.m. Eastern Time for our next JAN webcast, which is Accommodating Public Safety Workers with Disabilities.

Again, thanks for joining us, and have a great rest of your day.