Jeffrey Brown, Diversity and Inclusion Research Fellow at the Partnership on AI, discusses strategies for using AI-enabled recruiting technologies in ways that enhance diversity, equity, inclusion and accessibility (DEIA) priorities.
Intro: [00:00:00.99] Welcome to the Workology Podcast, a podcast for the disruptive workplace leader. Join host Jessica Miller-Merrell, founder of Workology.com, as she sits down and gets to the bottom of trends, tools, and case studies for the business leader, HR, and recruiting professional who is tired of the status quo. Now here’s Jessica with this episode of Workology.
Jessica Miller-Merrell: [00:00:26.22] This episode of the Workology Podcast is part of our Future of Work series powered by PEAT, the Partnership on Employment and Accessible Technology. PEAT works to start conversations around how emerging workplace technology and these trends are impacting people with disabilities. If you’ve been on this podcast or listened to it for a while, you know I love the topic of artificial intelligence, and today I’m really excited to dive in more around diversity, equity and inclusion, and accessibility when it comes to AI. Today, I’m joined by Jeffrey Brown. He’s a psychology researcher at the intersection of tech, identity, mental health, and education. Jeffrey is a Diversity & Inclusion Research Fellow with the Partnership on AI, as well as an Assistant Professor of School Psychology at San Diego State University. Jeff, welcome to the Workology Podcast.
Jeffrey Brown: [00:01:18.48] Thank you. Thank you so much for having me.
Jessica Miller-Merrell: [00:01:20.25] I’m really excited about this conversation today. But before we dive in, can you tell us a little bit about your background and how you got involved in your work around artificial intelligence and diversity and inclusion?
Jeffrey Brown: [00:01:33.00] Oh, sure. I come from a psychology background, as you said, doing decision-making research back in my undergrad days. So, shout out to Dr. Louisa Eagan and Dr. Paul Bloom for helping me get into research as a baby college student. From there, I went to grad school to study social psychology and school psychology, focusing on youth and families of color. All along, I was especially interested in mental health and how their environments could either bolster or harm mental health outcomes, particularly in communities of color, as well as their academic and professional outcomes. From there I became an assistant professor at Minnesota State University, Mankato, and really broadened my focus to look at how systems of discrimination, in general, affect folks who are marginalized in our society, especially gender and sexually diverse folks, people of color, and the intersections therein. Transitioning to AI, something I noticed as I did more work with schools and communities is how pervasive tech has become. More things are becoming automated, and AI became a buzzword. It first started out as a curiosity from the technical side. I started out mainly focusing on quantitative research, and I was interested in the fancy machine learning algorithms and things like that. But I quickly started to think about the ethical outcomes of predictive modeling when working with folks who are already marginalized.
Jeffrey Brown: [00:03:17.10] I started to think really quickly about these algorithms: first of all, who's designing them? And second of all, what are they being used to do? As someone who was trained on the people side of things, being a psychologist, a light bulb really went off when a colleague sent me the posting for this Diversity and Inclusion Research Fellow position at the Partnership on AI, looking at the work environments of people of color and how this might contribute to inclusive AI practices. The work that I did was funded by DeepMind, specifically to look at attrition in AI among diverse folks, people of color, folks who were marginalized along various lines of identity. And if I may say a little bit about the Partnership on AI, or PAI: at PAI the goal is to bring diverse voices together across global sectors, disciplines, and demographics so developments in AI advance positive outcomes for people and society. I took one look at their mission and said: Yeah, that's what I want to apply my skills and my work towards. I found my vision aligned with theirs: a future where artificial intelligence empowers humanity by contributing to a more just, equitable, and prosperous world. So, I started the position and found that, yeah, this is where I need to be. This is where I should be housing my research.
Jessica Miller-Merrell: [00:04:51.96] I think this is a great area for you to be working in, and one that I think about a lot. I think about the adoption rate of artificial intelligence in the workplace, the use by everyone within an organization, but especially in HR, and that number is growing every day. The number of HR technologies we are using as HR leaders is increasing because of remote work and our reliance on cloud and data storage and those kinds of things. So, it's important for us to think about how this artificial intelligence technology is being created. Who's writing the code? Is it discriminatory? And if it is, what can we do to fix that?
Jeffrey Brown: [00:05:41.48] Definitely. Exactly.
Jessica Miller-Merrell: [00:05:43.65] So I mentioned that HR leaders have been thinking about artificial intelligence. It has been a topic of conversation in our space for a number of years, but it's still relatively new in terms of adoption. I wanted to ask if you could talk to us about how artificial intelligence in the workplace could help us support diversity, equity, inclusion, and accessibility priorities, commonly referred to as DEIA for those who don't know.
Jeffrey Brown: [00:06:12.38] Yeah. So this is a great question, and please follow up if I'm sidestepping it a little bit. AI has primarily, as you pointed out, been used in a recruitment context to screen candidates. There are other use cases, but this is by far the most common way that AI in HR has been used. So, someone could write an algorithm, for instance, to select for folks with a certain number of years of experience as a software engineer, put several other parameters into the algorithm, and so on and so forth. And I think there are two important things to mention about that. One is that AI algorithms are, as you pointed out, ultimately written by us, and we are biased humans. The assumption is that AI is somehow fair, objective, unbiased. However, these algorithms were written by us, and we are biased. We are flawed. We discriminate. We're human. So ideally, AI would be used in concert with folks who are already trained in DEIA best practices, and of course, that falls under the umbrella of HR. Does that make sense? Sure, AI could help us in HR, but HR can also help AI. AI could streamline certain processes, but it can't replace good, evidence-based DEIA practices.
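To make the screening example above concrete, here is a minimal, purely hypothetical sketch of the kind of rule-based candidate filter being described. Every name, field, and threshold here is illustrative, not taken from any real vendor's system; the point is that the "parameters" are human choices, and each one can quietly encode bias that DEIA-trained reviewers should examine.

```python
# Hypothetical resume-screening filter of the kind described in the episode.
# All names and thresholds are illustrative assumptions, not a real product.

def screen(candidates, min_years=5, required_keywords=("software engineer",)):
    """Return names of candidates passing simple hard-coded rules.

    Bias risk: a strict minimum-years cutoff can proxy for age or for
    career gaps (e.g., caregiving), and keyword matching can penalize
    nontraditional titles. That's why the rules themselves, not just the
    results, need human DEIA review.
    """
    passed = []
    for c in candidates:
        has_keyword = any(k in c["title"].lower() for k in required_keywords)
        if c["years_experience"] >= min_years and has_keyword:
            passed.append(c["name"])
    return passed

candidates = [
    {"name": "A", "title": "Software Engineer", "years_experience": 7},
    {"name": "B", "title": "Software Engineer", "years_experience": 3},
    {"name": "C", "title": "Data Analyst", "years_experience": 10},
]
print(screen(candidates))  # → ['A']
```

Candidates B and C are rejected by rules a human chose, which is the sense in which "AI" screening inherits its authors' assumptions.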
Jessica Miller-Merrell: [00:07:58.94] Agreed. It's not a Band-Aid for an already bad situation in your workplace. And I want to link to this in the show notes. Probably one of the most well-known artificial intelligence case studies gone wrong, in my mind, is the Amazon case study with some of their technology. So go to the Workology Podcast show notes at WorkologyPodcast.com and take a look if you're unfamiliar with Amazon and their AI-powered sourcing technology that was found to be discriminating against candidates, particularly women, people of color, and people not from certain universities.
Jeffrey Brown: [00:08:45.71] Exactly. And that's a perfect case. I mention it in my report that's coming out. But I don't work for Amazon, so I don't know what's going on behind the scenes. I would hope, or I would bet, that they had folks trained in DEI issues really looking at that and seeking to make sure it would never happen again. So that's a perfect example of: well, how do we improve AI?
Jessica Miller-Merrell: [00:09:16.84] Agreed. Well, I can't wait to dive in more on this topic. You mentioned recruiting, and diversity recruiting in particular, and I think there are a lot of opportunities when we think about artificial intelligence and how it might be used to enhance our DEIA efforts, whether internally or externally, to recruit and retain diverse candidates. But we're finding continual challenges in retaining these diverse hires. I think a lot of heads of HR are going through this right now. Can you talk to us about your research on why this is happening?
Jeffrey Brown: [00:09:54.46] Sure. So basically, what I heard from participants, and also on the heels of other studies, is that it was much easier to recruit folks in a tokenistic way, not thinking about the overall norms of the organization or who created those norms in the first place. Does that kind of make sense? If the leadership has treated DEIA as just a checkbox rather than a priority, chances are the workplace will reflect those values as well.
Jessica Miller-Merrell: [00:10:32.71] I like that, and from a number of our other podcast interviews, your findings and the conversations those guests have been having reinforce each other. It needs to be our priority, and it needs to start from senior leadership down.
Jeffrey Brown: [00:10:50.41] Oh, really, you’ve heard from other folks kind of echoing the same thing?
Jessica Miller-Merrell: [00:10:53.77] Yeah, and I love that. Of course, mine's more anecdotal, in that I'm having conversations about DEIA, whereas you are a researcher and this is the work you're doing. So I am seeing the same conversations being had. Again, it's not just a checkbox. You can't just say: Hey, we're going to create a bunch of ERGs, and suddenly you are retaining your diverse workers. It's not a single solution that's going to do it all.
Jeffrey Brown: [00:11:28.09] Exactly. And even though I'm immersed in this research, it's always good to hear when people say: Oh, by the way, so-and-so is telling me about this too. These real stories affect change. So, it's always really good to hear folks working in the field offer examples and say: Hey, this happened.
Jessica Miller-Merrell: [00:11:54.06] Yeah, I have a great podcast interview with the head of talent for BarkBox, and she also oversees their diversity, equity, and inclusion program, including diversity sourcing, hiring, and recruiting. They were doing phenomenal work in this area even before she was brought on. But it is not just one-pronged; it is multi-pronged, multifaceted, and ongoing.
Jeffrey Brown: [00:12:26.18] Well, now I know what I’m going to have to listen to while I’m doing my report edits that I just got back.
Jessica Miller-Merrell: [00:12:31.43] Yeah, I'll send you the link over and you can take a listen. It's probably one of my favorite DEI practitioner interviews that I've done thus far on the podcast.
Jeffrey Brown: [00:12:45.35] Thank you.
Jessica Miller-Merrell: [00:12:45.92] I wanted to talk to you about another area: which companies do you feel, or are you seeing in your research, are doing the right thing when it comes to retaining employees from underrepresented groups? And this might include diverse racial and ethnic backgrounds, ability, sexual orientation, and gender identities.
Jeffrey Brown: [00:13:05.66] So, I'm so glad you asked this question, because it's really easy to look for the problems and the challenges, and there is obviously a whole lot of value in that. But then the question you get after that is: OK, now what? We know the problems; what are we going to do? So fortunately, in my research we specifically asked that. We asked DEI practitioners, but we also asked managers on AI teams and diverse folks on AI teams as well: What is going right? What is your company or organization doing right? And we got a lot of responses, because I found it can be a little tricky to find publicly available information about what companies are doing right. But I'll give a couple of examples, and these examples are somewhat representative of the themes I heard in my interviews. One example was a tech company that recruited from diverse professional backgrounds and had a built-in mentorship program and a cohort model. They recruited folks who wanted to work on, quote, unquote, the more technical side, so think software engineers and the like.
Jeffrey Brown: [00:14:29.84] The folks who were recruited were able to receive mentorship from others in the organization, and those mentors, in addition to giving mentorship, also received training and supervision from others in the company. Even though mentorship is something many of us do naturally as we become more senior in our careers, it's difficult, right? I'm assuming you've mentored people, and it's hard to do it right. So those mentors were also trained. It's very time- and money-intensive for the company. But guess what? They found good results. They retained those folks, and the folks who took part in the program were able to tell their colleagues and friends, also from diverse backgrounds: Hey, here's this really cool program. It's great. They look out for you. So that's one example of something really specific that a company put a lot of effort into, and it went well. That being said, the program also faced a lot of pushback. It wasn't all sunshine and rainbows, and the folks leading the program had to fight tooth and nail to continue to get resources.
Jeffrey Brown: [00:15:55.85] But that's what happens when DEI efforts are seen as competing with other parts of the company or other teams for resources. So that was one good example. We also heard, almost universally, that ERGs were helpful. And PAI actually hosted a workshop following the study, before the results were made public, with folks interested in engaging with some of the findings. There we heard that the story about ERGs was more complicated. Essentially, an ERG would start out really great, as a kind of grassroots effort, even though it was sanctioned by the company. But once ERGs became bigger and got more and more support, some companies tried to give them more oversight, maybe trying to influence them in a way that wasn't for the best of the people in the company. So, the story about ERGs is more complicated. ERGs are potentially extremely useful. But as you said early on in the podcast, Jessica, it's not a box that you can check: Oh, we have an ERG for Black people, or for Asian people, or for people with disabilities. Check, check, check. That's not enough. They need to be really employee-guided, but also very well resourced, and that gets tricky and complicated.
Jeffrey Brown: [00:17:29.56] And I want to specifically talk about ability here, because, as we all know, that is an axis of identity that often gets ignored. What I heard from folks is that proactive measures for accessibility, and creating a culture in which accommodating different folks and making policies that maximize accessibility for everyone is normalized, where it's proactive, is for the best. That is good for the employee, but it's also just good for the company, and it's the right thing to do. One last thing I'll mention, because I could talk forever about what companies should do, is to make sure there's a clear and supportive pipeline for folks who have complaints. This might be the less cute side of this research, but unfortunately, there are still instances of racism. There are still instances of sexual harassment, obviously, and of gender-based microaggressions, prejudice, and discrimination. There needs to be a very clear, supportive pipeline. And this is where there is maybe a tension between HR and DEIA, where HR could step in and be perceived as protecting the company rather than the employee. But I'm sure we could talk a little more about that.
Jessica Miller-Merrell: [00:19:05.68] From this question and all the stories you've shared, we could make a checklist: here's foundationally what you need, the considerations to make, and some ideas to get your DEIA programs going. But you are absolutely right. I love human resources. This is my chosen career. It chose me; I didn't choose it, but I have embraced it. We are not always seen as an advocate for the employee. We try to be. I think most of us do operate with good intentions, but at the end of the day, we are employed by the organization, and we ultimately report to the board of directors or the CEO.
Jeffrey Brown: [00:19:55.60] Exactly. You hit the nail on the head.
Jessica Miller-Merrell: [00:19:58.62] But I love all these suggestions you've made, and the research you're doing is going to help so many people, not just HR leaders, but CEOs and employees who are looking to add more accessibility, have a better relationship with HR, or start, grow, change, and evolve their ERG or other DEIA programming. This is going to be a great resource for anyone in the workplace.
Break: [00:20:35.51] Let’s take a reset. My name is Jessica Miller-Merrell, and you are listening to the Workology Podcast and it’s sponsored by Ace the HR Exam and WorkologyCouncil.com. Today, we’re talking with Jeff Brown about diversity and accessibility when it comes to artificial intelligence or AI technology. This podcast is part of our Future of Work series powered by PEAT, the Partnership on Employment and Accessible Technology.
Break: [00:21:01.17] The Workology Podcast Future of Work series is supported by PEAT, the Partnership on Employment and Accessible Technology. PEAT’s initiative is to foster collaboration and action around accessible technology in the workplace. PEAT is funded by the U.S. Department of Labor’s Office of Disability Employment Policy, ODEP. Learn more about PEAT at PEATWorks.org. That’s PEATWorks.org.
Jessica Miller-Merrell: [00:21:29.89] I want to switch gears a little bit and talk about HR technology. Can you talk about what we should be looking for when it comes to vendors? There are a lot of different HR technology providers and vendors in the space, and when we are considering implementing an AI technology for HR or hiring, what should we be looking for? And I also wanted to ask: how can we ensure that we engage diverse vendors? Because the research you talked about at the beginning suggests that if you have diversity on the team, hopefully your AI is more inclusive. But how can we figure that out? How do we know whether a vendor's artificial intelligence team is diverse? These kinds of things are the questions keeping me up at night.
Jeffrey Brown: [00:22:26.04] Oh, yeah, I'm really happy you asked this question. Again, I heard from folks about maybe the darker, more unpleasant, or more difficult side of these conversations. I heard from folks whose companies had clients or vendors who made sexist or racist comments towards employees. And obviously, the obvious suggestion is: don't engage with folks who make those kinds of comments. But I think it also speaks to the priorities and values of the organization. Do you want business at any cost? Or are your actions guided by what you prioritize and value? Are inclusion and belonging part of your values? If so, then it should be easier for you to have an eye for selecting vendors who also prioritize those values. So that doesn't quite answer the question. But it helps to have someone on the team doing the kind of work that I'm doing, or that my colleague Dr. Tina Park is doing, focusing on inclusive practices in artificial intelligence: sociologists, anthropologists, psychologists, diversity and inclusion professionals, who can have in-depth discussions with these vendors about not only how they approach their work, but also what their values are and how they live those values.
Jeffrey Brown: [00:24:24.54] Folks who are well trained in this kind of work can sort of tell when a vendor or a third party is faking it or paying lip service to DEI conversations rather than actually believing in it. Not always, but I would say folks with this kind of training and research under their belt are better equipped to suss out what the true values of these vendors are. Again, not perfectly, but it's a start. So, long story short, I would say make sure to have folks on your team, especially researchers with a strong research base, who are able to suss these things out.
Jessica Miller-Merrell: [00:25:20.52] Awesome. I feel like it also helps, if you're meeting with the team of a prospective HR technology company, to look at the visible diversity. I mean, if you're at an HR tech conference and you can see the people at the booth, that's great; diversity might be there. But I'd also just point blank ask: Hey, what does your engineering and development team look like in terms of makeup? We're trying to be more inclusive, and I want to make sure this technology supports that. And maybe your questions should focus on persons of color and other minority classes, women, and definitely persons with disabilities. I feel like in all the podcast interviews that we've done in partnership with PEAT over the years, that's really the takeaway. We just need to be forward and upfront and say: Hey, I'm trying to be more diversity-focused, inclusive, and focused on accessibility. What does your technical team look like? Who are they?
Jeffrey Brown: [00:26:38.10] One hundred percent. And I would say that's just a baseline, right? Because a company can include folks tokenistically versus actually valuing diverse perspectives, and that's harder to suss out. I don't pretend to be an all-knowing wizard on that. But that should definitely be the baseline. I agree with you.
Jessica Miller-Merrell: [00:27:07.26] Yeah, it shouldn't just be: hey, who's in the conference booth, or who's on the website? You need to dig a little deeper. And I would also say that we should not forget intersectionality here, and how individuals can belong to multiple protected groups.
Jeffrey Brown: [00:27:25.40] One hundred percent, exactly. And in 2022, intersectionality should be the new baseline. In this industry, we all should be not just learning about it; we should be reading and making ourselves more aware of intersectionality, beyond the baseline at this point. So yeah, I one hundred percent agree with you.
Jessica Miller-Merrell: [00:27:55.19] How can AI technology help support the efforts of HR and employers in recruiting and retaining diverse groups and including different races, ethnicities, nationalities, and not forgetting persons with disabilities, too?
Jeffrey Brown: [00:28:11.57] Yeah, that's a good question. Harkening back to what I said earlier: remember that AI is not a replacement for good DEIA training. It's not a replacement for shaping the priorities and the mission of the organization to really value inclusion and belonging. Having that as a start is extremely important. AI comes from us; we design AI technology, so we can design it to support us in a variety of ways. But it's more important to make sure that this technology is designed well, with inclusion in mind. Now, you mentioned intersectionality, and my research suggests this is crucial to AI teams as well: intersectionality within teams may be stronger in inclusive teams, and may also make the work product better. I have no conclusive data to suggest that diverse teams make, quote, unquote, better AI, but there is research out there, most famously the BCG study, speaking to the business benefits of diverse teams. Though I personally think that should be secondary. I'm not a businessman; I'm a psychologist. So I'm like: sure, it'll be good for business, but more importantly, it's the right thing to do. But even beyond that, something came out in my interviews that I did not necessarily predict going into the study. Diverse folks, again along the lines of gender, race, ethnicity, ability, sexuality, et cetera, who were on teams that prioritized not only diversity in identity but also diversity in professional or disciplinary background, those teams seemed to be very functional and inclusive.
Jeffrey Brown: [00:30:25.61] So teams, especially AI teams, that say: of course, we want folks of various races, ethnicities, genders, and so on, but we also want folks with diverse disciplinary backgrounds. We want sociologists, anthropologists, folks trained more on the people side of things. We want experts on human-computer interaction who also study race. That interdisciplinary perspective goes beyond just the, quote, unquote, technical, the, quote, unquote, software engineering. Teams like that also happened to be teams that were diverse and inclusive. So having your team come from a range of disciplines and perspectives can't replace diversity along the lines of identity, but it's something really important to keep in mind as well. And that, I think, goes way beyond tokenistic diversity into true diversity. And I do want to make it crystal clear: if we have a team of, say, five people who are all the same race and gender but come from different disciplines, I'm not saying that's a replacement. I'm saying this is another dimension of inclusion that many, many companies are overlooking.
Jessica Miller-Merrell: [00:32:06.20] I agree. I feel like it's another layer of the onion, you know? A layer that's further in. The priority should be on race, ethnicity, nationality, disability, sexuality, all those things. But having a group that's diverse in terms of experiences and backgrounds is also important, and makes a team more diverse and inclusive for that reason.
Jeffrey Brown: [00:32:40.67] Exactly. And you'd be surprised how often these things go hand in hand. Again, I don't have conclusive data on this, but folks who belong to marginalized identities may be the ones asking questions from a more interdisciplinary point of view. They may be thinking: OK, my job is to work on this AI team, or even this HR team, but I'm not only interested in HR as a blank-slate, abstract concept. I'm interested in how this affects people who look like me, people who look like my neighbors, people I grew up with. I'm interested in looking at it through a lens that's not just the mainstream, traditionally white, middle-class, Eurocentric one. Does that make sense?
Jessica Miller-Merrell: [00:33:36.20] Absolutely. Absolutely. And I keep thinking that I truly love that the conversation is starting with companies in the artificial intelligence space, but this is definitely something we should be considering for our own organizations. You don't have to be in the AI space to be more inclusive and diverse in these different areas. But I think this helps with the ethics and the development of the AI technology that is really going to be driving not just future business, but future life.
Jeffrey Brown: [00:34:18.86] Yes, exactly.
Jessica Miller-Merrell: [00:34:20.03] I want to go back just a minute. We did talk a little bit about intersectionality, and I wanted to ask if you could share your thoughts on HR's role around the intersectionality of diversity, equity, inclusion, and accessibility, maybe for those who aren't familiar, or even for those who are. I want to hear your thoughts in this area.
Jeffrey Brown: [00:34:43.55] Yeah. So, as you said, we talked a little bit about this before. A lot of data from my study speak to HR's role protecting the company rather than the employees. So there's that tension, but I don't think it has to be the case. And I'll defer to you; I'm not an HR specialist. HR does not exist on an island; it reports to leadership, as you also mentioned earlier. I have a lot to say about this, but suffice it to say, step one would be for HR to really interrogate their own policies to see how well they lend themselves to intersectionality and to accessibility. Maybe many HR teams have already done that. But I would ask: What standards are you measuring against? Are you including outside perspectives as well, not only people outside the HR team, but standards from different disciplines and different teams?
Jeffrey Brown: [00:35:55.73] So I would say HR teams would do well, and many are probably already doing this, to step outside the HR bubble and get expertise from other areas and other teams. Other teams already do this with traditional HR specialists. For example, software engineering teams will often look to folks in HR to help build inclusive practices, and in a lot of companies that is actually standard. So why can’t it be the other way around, too? Why can’t companies build that in? This might sound like a lazy answer, saying it all comes from leadership, but from the work that I’ve done, it ultimately does boil down to leadership and what priorities leadership wants to focus on in their organization.
Jessica Miller-Merrell: [00:37:02.72] I loved our prep call conversation, and I’m definitely enjoying this conversation, because your opinion and experience with HR, along with those of the people you have interviewed and the research that you’ve done, are very different from my perspective and point of view as somebody who has worked her whole career in human resources. So I definitely think it’s important for HR leaders, as you said, to step out of the bubble and get perspectives, points of view, and maybe some pushback, some feedback that they weren’t necessarily wanting to receive. But it is important, because these perspectives matter.
Jeffrey Brown: [00:37:48.33] Yeah, one hundred percent. And I can think off the top of my head of HR teams that are doing this really well, so I definitely don’t want to say HR is the enemy. In fact, the ones that do this really well are kind of the heroes in their organization, and they’re well loved. Fun fact, Jessica: about 13 years ago, right out of college, I applied for an HR internship and was a finalist but didn’t get selected. Had I gotten selected, maybe my life would have been different.
Jessica Miller-Merrell: [00:38:18.48] Maybe you would still have been on my podcast, but you might have been talking about what it’s like to be a Chief HR Officer at your organization.
Jeffrey Brown: [00:38:27.42] Who knows?
Jessica Miller-Merrell: [00:38:29.13] I wanted to ask you a final question. What does a healthy balance of ethical policies in AI look like in your mind for HR?
Jeffrey Brown: [00:38:38.46] Again, I could maybe write a whole paper on this, co-authored with someone in HR. But suffice it to say that policies come from informed, data-based decisions. And when I say data-based decisions, I don’t necessarily mean you have to have specific numbers measured with specific instruments over a specific amount of time, or ten papers from Google Scholar to back it up, although that’s great. Data can also be qualitative. Data can be a series of conversations, or the demonstrated impact of certain policies. So again, policies come from informed, data-based decisions. There’s already so much work out there, drawing from various disciplines, not just HR, and I want to put in a plug in the AI space for Dr. Timnit Gebru’s work, Dr. Meg Mitchell’s work, and Dr. Hanna Wallach’s work, and also, of course, the work that PEAT is doing on accessibility. So, lots of plugs, lots of work out there. Looking outside of the discipline at folks who have looked at AI ethics specifically would be a good start.
Jessica Miller-Merrell: [00:40:04.41] Agreed. Well, thank you, Jeff, for taking the time to talk with us on the podcast. You mentioned research. Where’s the best place to go to get up to date on your work in this area?
Jeffrey Brown: [00:40:16.95] Yeah. I would go to the Partnership on AI’s website, PartnershipOnAI.org, and I can send you a link that you could put in the show notes. There you’ll find a link to this research. We wrote a series of three blog posts, which are very public-friendly versions of the research, and our report should be coming out sometime in the next few weeks to a month or so. I’m looking forward to releasing more research on this as well. We’re going to turn this into an academic paper, too, so that will come out, and then we’ll take it from there. We’ll see how we transform this work into some other things in 2022, but PartnershipOnAI.org is a good place to start.
Jessica Miller-Merrell: [00:41:13.07] The resources section is really fantastic, and one of the series that Jeff was talking about is on attrition in artificial intelligence, which I think all of us are experiencing, fortunately or unfortunately, in our organizations. So check out the work in that area, because I definitely think there is application that we can use in our industry, whether it is artificial intelligence, hospitality, retail, or manufacturing. Across all industries, we can learn from the work that you’re doing, and I really appreciate you taking the time to chat with us today.
Jeffrey Brown: [00:41:50.10] Yeah, thank you. Thank you so much.
Closing: [00:41:52.53] Technology can be a bridge or it can be a fence. Artificial intelligence has come a long way in the past decade, and we see it everywhere in HR: on career sites, with chatbots, in automated emails from our applicant tracking systems, in candidate assessment tools, and in matching tools. Artificial intelligence is like the grocery store self-checkout, but for HR. As with any other tech, we use it to help make our jobs easier, but it is important to ensure that the technology aligns with our company culture and our goals around diversity, inclusion, and accessibility. The Workology Podcast is sponsored by Ace the HR Exam and WorkologyCouncil.com. This series that you’re listening to right now is my Future of Work series, and it is powered by PEAT, and I appreciate and enjoy working with them. A special thank you to Jeff for all his insights and expertise. We will include his research, along with other information that we shared and an interview with BarkBox, in the show notes section. Just go to Workology.com and you can take a look at all the resources for this specific episode.
Closing: [00:43:05.54] Personal and professional development is essential for successful HR leaders. Join Upskill HR to access live training, community, and over one hundred on-demand courses for the dynamic leader. HR recert credits available. Visit UpskillHR.com for more.
Closing: This podcast is for the disruptive workplace leader who’s tired of the status quo. My name is Jessica Miller-Merrell, and until next time you can visit Workology.com to listen to all our Workology Podcast episodes.