Lydia is Policy Counsel at the Center for Democracy & Technology and the Director of Policy, Advocacy, and External Affairs at the Autistic Women & Nonbinary Network. In this video interview, Lydia dives into the accessibility of artificial intelligence (AI) and the implications of AI and machine learning for people with disabilities.

Video transcript

PEAT. Building a future that works.

Hi, I’m Josh Christianson, co-director of the Partnership on Employment and Accessible Technology, or PEAT. I’m meeting with people with disabilities from across the country to reflect on the role technology has played in their careers and the potential it holds for the future of work. Can’t wait to dive into my conversation with Lydia X. Z. Brown on the accessibility of artificial intelligence, or AI, and the implications of AI and machine learning for people with disabilities. I’ll let Lydia introduce themselves.

This is Lydia X. Z. Brown. My pronouns are they and them. I’m currently the policy counsel for the privacy and data project at the Center for Democracy & Technology. I also work part-time as the Director of Policy, Advocacy, and External Affairs at the Autistic Women & Nonbinary Network.

Talk a little bit about some of your concerns about machine learning and artificial intelligence as they’re used in technologies, and what people could do to mitigate those concerns.

AI and machine learning are only ever as good as the design for which they are created, the data used to feed, train, and calibrate a particular application, and the purpose for which it is ultimately deployed. Using AI as a way to make decisions about who to hire, who to assess for a bonus or a raise or a promotion, or how to structure your workflow or communication among employees in a particular team or across teams, can seem very promising, because you think we can automate these functions that can be difficult, tricky, and sometimes hard to do with an individual person making that assessment. But the reality is that many of those applications rely on approximations of human behavior and human activities that don’t line up with reality, that don’t account for the infinitely varied ways that disabled people exist, ways that will never be codified as the norm or the standard, and that will then inevitably result in penalizing and discriminating against disabled people for a number of reasons that may not be readily detectable.

I always encourage people who are designing or launching any product, application, or website, whether they’re thinking of the visual display or the user experience of actually navigating a particular process or pathway, to go to the drawing board and start there, in collaboration and partnership, with multiply marginalized people.

Eye-tracking software, for example, not only raises important concerns about people’s ability to protect their privacy but also does not work for all people. Blind people, autistic people, people with cerebral palsy, and people with any number of other disabilities, whether acquired or present from birth, will not have eye movements that would be considered standard. And so even if an employee with one of these disabilities is sitting at their desk for three straight hours actually doing work, eye-tracking software that is attempting to evaluate their performance will probably not accurately capture what their performance is like.

Lydia, thank you for sharing that information and for your time today. Much appreciated.

PEAT. Building a future that works. Peatworks.org.

PEAT is funded by the U.S. Department of Labor’s Office of Disability Employment Policy under contract no. 1605DC-19-F-00213/P00002. PEAT material does not necessarily reflect the views or policies of the U.S. Department of Labor.