Episode 7

Would Stephen Hawking get hired today? The hidden bias in AI recruiting tools, with Susan Scott-Parker

Published on: 3rd April, 2025

Could AI hiring create more barriers for people with disabilities instead of levelling the playing field? In this episode, Susan Scott-Parker, founder of business disability international, says standardised and inflexible AI hiring systems risk shutting many people with disabilities out of the workforce. She makes the case for more inclusive HR technologies that are designed for the full range of human experience. 

Tune in to learn more about:

  • The unsettling truth about how AI hiring tools are screening out candidates with disabilities - and how to make them fairer
  • How HR leaders can challenge biases in AI hiring tools and demand more from the technology they invest in
  • Why Susan coined the term “disability confidence” and why it’s not just about “being nice” to disabled people

Missed last week's episode? REPLAY: Cliff Weitzman on building the 'voice of the internet'

---

About Susan Scott-Parker

Susan Scott-Parker OBE is a creative thought leader internationally recognised for her work on how to mobilise business leadership behind disability equality. She founded the world’s first business disability network, now the Business Disability Forum (UK). In 2016, she established business disability international and advises a growing global community on how to work productively with businesses as valued allies.

Follow Susan Scott-Parker on LinkedIn: https://www.linkedin.com/in/susanscottparker1/

Learn more about business disability international: https://www.businessdisabilityinternational.org/

Learn more about Project Nemo: https://projectnemo.co.uk/

Follow Amit Ghose: https://www.tiktok.com/@amitghosenf1

---

Connect with Made for Us

Transcript
SSP:

Most of the HR technology is grounded in data relating to the workforce, people in work. And so the HR people are out there trying to find candidates who actually resemble as closely as possible people they already have. Well, they don't have people with disabilities that they're aware of in their workforce in the same numbers as they have non-disabled. So the data is always biased.

TS:

Welcome to Made For Us, the show where we explore how intentional design can help build a world that works better for everyone. I'm your host, Tosin Sulaiman. You may have heard the story about the hiring algorithm accused of sexism. At least that's how some headlines framed it.

A major tech company built an experimental hiring tool to scan online resumes and identify the most promising candidates. But soon it became clear that the algorithm was biased against women. It downgraded candidates who attended all-women's colleges, for example. The reason? The tool had been trained on resumes from existing employees, most of whom were men. The company ultimately scrapped the project and disbanded the team behind it.

TS:

But hiring tools like this haven't disappeared. And my guest, Susan Scott-Parker, the founder of Business Disability International, says the risks of discrimination go beyond gender. Imagine replacing men with non-disabled people and women with people with disabilities in the story I just told. In our conversation, Susan makes the case for more inclusive HR technologies that are designed for the full range of human experience.

And she argues that developers should be required to prove their tools are safe for disabled and other disadvantaged groups before they hit the market. A quick note, this is a topic we'll be coming back to next week when I'll be speaking to Ariana Aboulafia from the Center for Democracy and Technology about her research into the disability data gap. So be sure to tune in for that. Now here's my conversation with Susan Scott-Parker.

SSP:

I was the founder, am the founder of the first business disability network, the national one in the UK. And I am working as strategic advisor to the United Nations Global Business Disability Network at the ILO, to Valuable 500, to Purple Space, to all sorts of really interesting international organizations mobilizing business as potential allies of people with disabilities.

TS:

So if we could go back to the start, how you first got involved in this space, can you tell us a little bit about that story? What was the motivation?

SSP:

It goes back so far. It's a question I'm not very good at answering. But I suppose I have a very early memory. I'm a Canadian, and I'm teaching swimming when I'm in high school. And the guys at the YMCA tell me, one, that they want me to teach a class of kids who have disabilities. I say that's terrific. And then they announced with great pride that this was the first time they'd ever allowed children with disabilities into the swimming pool at the same time as non-disabled children. And so...

Even when I was 16, that struck me as very odd. I guess I would say, even when I was 16, Queen Victoria was not on the throne. That is not that long ago. And there was still this implicit understanding that somehow our worlds were so different that somehow these children would never be allowed to swim in water with non-disabled kids. So I started to meet some of those young people and I guess it went from there.

TS:

So how did that initial experience lead you to what you're doing now? What was the journey?

SSP:

Well, I suppose it caused me to be curious as to why such a highly regarded institution could get it so fundamentally wrong about something that seemed to be so fundamentally basic in terms of how we treat each other and the reality of what it means to be human, because having a disability is just part and parcel of what it means to be a human being. And so I looked at Business in the Community in the UK, this extraordinary organization that, over more than 20 years, with Prince, now King, Charles as president, has brought companies together to look collaboratively at their relationship with the communities in which they do business, to understand that healthy communities are good for business, mobilizing senior business leaders in a way which, in terms of enlightened self-interest, caused them to invest in very different ways in the markets and communities in which they operate.

So I just stole the model, I stole the idea really. And I said to myself, what if a group of companies were to come together and jointly fund a small expert team that made it easier for them to get it right as both employers and providers of goods and services, in a way that then benefited people with disabilities, to try to bridge the divide between disabled people and business?

TS:

And so that was the Business Disability Network.

SSP:

That's the UK's business disability network. That's right.

TS:

And how did that evolve into Business Disability International?

SSP:

Well, as we were working with large private sector organizations, we would be talking about what best practice required. For example, making sure all your websites were accessible to people looking for jobs, wanting to buy stuff. And we'd hear things like, well, that's controlled out of New York, there's nothing we could do in the UK about that. So the board in the UK decided we should approach some organizations to help us create something that could operate more effectively at a global level. And we ended up with three founders, Barclays, Infosys and GlaxoSmithKline (GSK), the pharmaceutical company, who helped us to develop a new model, if you like, for working with global HQs in such a way that it became easier for leaders at national level to make progress. And so now we're working with existing networks like Valuable 500 and the ILO. And so it's evolved more into a trusted advisor and mentoring role.

TS:

And you first coined the term disability confidence. I'm curious to know how that came about. What was the inspiration?

SSP:

Well, I was sitting in my office in London, I'm Canadian, but of course I've been in London for a long time, and I was getting really tired of the fact that in most places you went to, the first conversation with a business leader was: why don't you hire more disabled people? That's a very short chat. The answer is, well, they don't apply, in most places. And you're talking to the wrong person, that's probably HR. And so I thought what we were really trying to do was to capture the interest of senior leaders in the fact that if they improve their disability performance, they deliver business improvements. Because if you can learn to recruit people with disabilities on a fair and equal basis, even a Canadian woman might be able to get through, right? Because you get less distracted by stuff that's got nothing to do with whether or not they could do the job, if you were clever and a bit flexible.

But of course, it's also true that in the initial conversations with so many people, and not just business people, there's an immediate unease. Are we even allowed to say the word disability? Isn't that kind of an insult or something? You need the confidence at a personal level. So I wandered around going, disability confidence, can we define it? Yes. Can we measure it? Yes. And then of course, to my surprise actually, it really has taken off. I would just say though, I'm a little anxious that it has lost the equality piece as a definition.

It doesn't just mean that you're nice to disabled people. It means you actually deliver disability equality in a meaningful way. And so I think we need to come back to that and try to get that message across more effectively.

TS:

How do you define disability confidence and how do you measure it?

SSP:

Well, it's got four components. You understand the impact of disability on the business. So you know, for example, that at least one in three of your customers aged over 50 will have a disability. And if you're in tourism, you know that customers over 50 are the biggest spenders. So you put that together. And that's why in Botswana, you're seeing increasing interest, even on safari, in vehicles that let wheelchair-using high spenders get onto those safaris. You understand the percentage of students in your local area that have a disability and you're looking at talent pipelines and how you bring them in. So there's that basic understanding, which I suppose could translate to competencies. You are barrier-free for groups. So this is the accessibility stuff. And so much work has been done over the years in encouraging procurement to only buy accessible technology. There's still a lot of work to be done there, as they're beginning to learn that it costs more to fix it later than it would have cost to require it to be accessible when you bought it.

And you make adjustments for individuals. And I think it's really important now that you meet the expectations or the standards set by the ILO labor standards. So I would be looking for things like, do you pay equal pay for equal work? And of course, America, I'm afraid, has legislation which permits an employer of people with disabilities not to pay the equal wage, not to pay the minimum wage. And so it's something you can quantify.

TS:

I think this is a good time to talk about your recent work, which is focused on the intersection of business, disability, and AI, in particular the potential for bias in AI recruitment tools. So I understand you've come across lots of stories and examples from people impacted. Can you give a sense of what you found?

SSP:

Well, this is a classic market failure where we've got HR people who don't understand disability discrimination buying products developed by AI developers who don't understand disability discrimination. And where the developers then argue, our customers don't care, why should we? So there are a number of issues here. Most of the HR technology is grounded in data relating to the workforce, people in work.

And so the HR people are out there trying to find candidates who actually resemble as closely as possible people they already have. Well, they don't have people with disabilities that they're aware of in their workforce in the same numbers as they have non-disabled because disabled people are always at least twice as likely to be unemployed as anybody else. So the data is always biased on the disability front. That's a given and we need to make that really clear.

SSP:

However, it's more than just biased data, because what we also have is developers who advertise that their product has removed human bias because it's in a standardized process. So that's like saying that the next Stephen Hawking has to go to an interview room up two flights of stairs because everybody does. We're supposed to know that you have to make adjustments so that people can demonstrate what they can do fairly. So you would, you know, you'd have the interview down at the bottom of the stairs.

Well, AI is the new stairs. I mean, we see things like someone who stammers being told that he's got exactly three minutes to get through that video interview or they discard him. He takes three minutes fifteen because he stammers. Or, you know, they're looking at eye contact with the camera. But say I'm visually impaired; one of my friends says her eyes dance, they're all over the place. Well, she's not going to get through.

You've got people with hearing loss whose voice patterns are unusual, sometimes the voice goes up when most people's voices would be steady. Well, they're gonna get discarded because it's trained on certain voice patterns. The problem here is that the processes they drop these tools into, because they're standardized, don't actually tell the job seeker: you're gonna have to go through a three-minute video, and then you're going to be interviewed by a robot, and you rely on people letting you lip-read through the interview.

SSP:

So you can't ask for adjustments because you don't really know what's coming anyway. You've no idea where AI has and has not been given the right to throw you out, because of course you've no way of knowing you were discarded because of an algorithm. So how on earth do you take that individual case to court in countries where you do have some rights to do so? And so the process itself, because it's rigid, is problematic.

And then of course you've got some of these products that claim to read your personality from a 60-second piece to camera. The one that I'm thinking of in particular was tested by a German broadcaster and they found, though it had nothing to do with disability, that when the actress is looking at the camera being interviewed for her 60 seconds, she gets her personality profile sent to the recruiter. This actress did exactly the same interview, same face, same voice, same words, but she put on glasses and her personality deteriorated by 10 points. And when she did it again, same face, same words, same everything, trained actress, her personality improved by 20 points because the camera spotted an oil painting on the wall behind her.

SSP:

Now imagine if it had spotted a wheelchair, or if she can't smile because she has a paralysis of the face, or she's wearing an unusual hearing aid, or she's got a serious birthmark that would be regarded as a facial disfigurement. There are all kinds of reasons why a camera that thinks wearing glasses cuts your personality profile might decide you're more neurotic. I'm sorry, I have to laugh, because I was looking at the scores and you are 20% less likely to be neurotic if you've got an oil painting behind you. I mean, this is serious stuff. Does a recruiter want to hire neurotic people? No. So you can just knock them out by using this tool on the strength of a 60-second glimpse of you on camera. So the science that underpins it is deeply flawed. And when people are talking about bias on grounds of race and gender, why is disability not on that agenda?

TS:

There are so many different aspects to this. If we could go back to this idea of bias and the fact that these AI systems are aiming to remove bias. Obviously we know that human beings don't have a great track record either when it comes to bias in recruiting.

SSP:

My response instantly is you might get lucky when you're applying for a job and meet an open-minded recruiter, but when it's bedded into an AI-powered system, it's every time. You have no chance of getting through, but you might get lucky with human beings. So at least the randomness of that is in your favor. So I would take random human bias any day over unavoidable, inevitable, built-in discrimination. And the word bias is a bit of a disguise word here, because the bias leads to discrimination, unfair treatment, people being denied the chance to work for the rest of their lives. Bias is sort of a sweet little word in a way, right? Oh well, we're all biased, unconscious bias training. But the point is, discrimination is the key word here, unfair discrimination.

SSP:

So actually my involvement in this issue stems from reading an article where they were pointing out some of the problems that people were encountering getting through video interviews because they were black. And I thought, well, what if they were getting a video interview and they had a facial disfigurement? Would the technology recognize their face? I had that, my God, this looks really scary moment. And yet I read a piece about a young man trying to get a job just with Deliveroo.

Deliveroo required him to take his phone and take photographs of his face to submit with his application. The AI didn't recognize that he had a face because he has a facial disfigurement. So he can't even apply for a job delivering groceries on his motorbike.

TS:

The man Susan is referring to is Amit Ghose. Amit applied for a job as a delivery rider in 2023 to earn some extra cash to fund his wedding. But when he tried to verify his identity, the facial recognition software didn't recognize his face. He told his local paper the Birmingham Mail, “It recognised my head turns but when I had to blink and open my mouth it rejected me. It left me disheartened and frustrated. Eventually they apologised and they unlocked it for me.”

A Deliveroo spokesperson told the paper at the time, “We are sorry for the experience this rider had verifying his identity. As soon as this was brought to our attention, we reviewed what had happened and put an approach in place to resolve the issue.” Deliveroo didn't respond to a request for comment on this episode.

Amit has run into the same issue with AI tools in other areas, not just recruitment. Here he is in an interview with Project Nemo, a campaign group focused on disability inclusion in fintech, where he describes his experience with facial recognition software in banking apps.

AG:

I've had 24 surgeries on my face alone. My condition is called neurofibromatosis type 1. It's more commonly known as NF1. The nature of the condition is that it causes tumors to grow on nerves. For me, those tumors grew on my face, making me look visibly different.

So when I was 11 years old, the tumors were growing quite rapidly. They started to grow on my left eyelid, which resulted in me having to have my left eye surgically removed. That affected me by leaving only vision in one eye and having this prosthesis now. So it took me a long time to accept who I was. You know, for a very long time, I hid the left hand side of my face. Almost over 30 years, it took me to accept myself.

My wife encouraged me to share my story with the world. One video on TikTok went viral. I got 20 million views. I couldn't believe it. What I wish others knew about me was that I'm just another human being. I think my friends have had a huge impact on me. Just that normalization of things. Just that thing of, you're just one of us. Using facial recognition to make payments online via banking apps has been a challenge for me. The banking apps will not recognize my face and consistently say, face not recognized, please try again. Face not recognized, please try again. And that makes me feel as if I'm not a human.

AG:

Greater accessibility and inclusion for people with disability would mean that we feel included. At the moment, it doesn't feel like that.

TS:

This industry is growing so rapidly and the developers of these tools, and the companies that buy them, argue that it saves time and money and helps them find the best candidates faster, more efficiently. It's hard for them to say no to a tool that dramatically cuts down the time to hire a new employee, for example. I guess this is what you're up against.

SSP:

Absolutely. I mean, the key task now for recruiters is to discard as many candidates as possible, as cheaply as possible, to try to miraculously find the few that are worthy of human attention. And so the tools are hugely attractive. I don't think many of them realize that this will discriminate not just against people with disabilities, but inevitably against a much wider group as well. The examples I give are examples of how humans differ. So around the world, there are 50 million people who stutter or stammer, right? And we've got one in three people aged 50 to 65 with disabilities. You've got a huge percentage of individuals with health conditions that are actually gonna have an impact on how they get through these systems.

Many people with facial disfigurement just regard themselves as having a different face. Well, lots of people have different faces. Imagine how many of them are struggling to get through. So the challenge for the best practice HR director is not to just have one route in. You need to give people more than one way of applying and not to allow without scrutiny people to be discarded regardless of whether or not they could do the job, which means they have to ensure that the systems are designed so that the individual can request an adjustment to the system.

SSP:

You still have to say at the very front: this is what's coming; if you need an adjustment, let us know. It is a legal obligation in many places; just because it's online doesn't mean it's somehow ethically neutral. And they need to put pressure on their developers, who are very clever people in their own way, to say: we need you to design systems that are flexible, that adapt to human reality. They're the ones with the purchasing power.

TS:

And you did touch on this earlier. This issue of bias is well known within the AI community, particularly the responsible AI community, especially when it comes to gender and race. But you've said people with disabilities are missing from that conversation. Why do you think that is?

SSP:

Well, even just looking at gender, we know that one in five women worldwide will have a disability. So even when they're talking about gender bias, they only mean some women, the 80% who experience this discrimination on grounds of gender. Who knows how many of them are actually first knocked out because of their visual impairment, hearing impairment, whatever.

Why do we not want to talk about disability? Isn't it an interesting, nice, big, chunky question? In the short term, it's too complicated and there's a mixture of distaste. And so I think the developers hide behind the 80-20 thing. Well, you know, we deal with 80% and that's enough, rather than realizing that systems that work for extreme users work better for everyone. And so that basic insight into design is missing. But I think people are uncomfortable with the word, which is why we keep coming up with these euphemisms. My favorite is handi-capable.

SSP:

We just don't want to look it in the eye and say people with disabilities, disabled people, as many prefer to be called. If we live to be 70, every one of us will have at least 10 years of disability as part of the experience of being alive. I think it's distaste, fear. I think there's a little bit. I wonder if this is true. Is there a thought that if you can't get through my online recruitment process, well, we didn't want you anyway.

You know, it's a kind of, we'll make you jump through all these hoops and if you can't get through the hoops, well, it's one way of quickly screening you out anyway. It's not our job to make the process barrier free. The more barriers, well, the fewer get through and those are the ones, the winners that we want. I don't know. But most of it I think is a kind of deep rooted assumption that disabled people are not valid.

TS:

And another point you made earlier was how does someone prove they were denied a job because of a biased algorithm? People often never find out why they're rejected.

SSP:

Actually, we've only seen one case coming through in the States, an individual who applied for 800 jobs managed through a particular AI-powered HR platform and didn't get anywhere with any of them. And his case, as he's taking it through the courts, is that he was discriminated against by that AI tool on grounds of his race and disability. Oh, and age. I think he's also arguing that the system knocks out people on the basis of age. And of course, age and disability are closely correlated. It is extremely difficult.

What we need is to amplify, if you like, the authority or the focus of the equal opportunities related, equality related bodies that regulate the obligation to make adjustments with more consumer based protection legislation so that any AI developer would have to prove his product was safe before he could put it on the market. We expect that when they put aspirin on the market, they have to put something on the package that says actually these are the potential harms, the potential side effects. So from my perspective, we're pushing strongly for developers of these HR tools to have to put on the pack the groups of human beings that were not taken into account, not consulted, not part of their risk assessment process.

SSP:

So that an HR director can see that with this package, no one with a facial disfigurement, with a visual impairment, with a hearing impairment, et cetera, et cetera, was involved. And so those groups are potentially at risk. And that then at least focuses the HR director on: am I prepared to accept that risk? We will discriminate against these people unless we're very careful.

And of course, ideally, the regulators would then be able to look at all the packages and say, why on earth do these guys still put on the market products that are potentially discriminatory against so many groups? That would be really helpful.

TS:

So we've talked about what the developers should do. We've talked about what the HR buyers should do. What else is on your list of recommendations?

SSP:

Well, I suppose the key is that we have to find a way to capture the stories of these real human beings trying to get through these systems. And so when I say someone who stammers needs an extra 15 seconds, people go: of course, I never thought of that. And it's that 'of course' moment that we're trying to get across here. We had some wonderful volunteer help to get our website up, disabilityethicalai.org, where we're trying to put some examples of the stories, some visuals, just to bring the story alive. So if anyone who's listening to this has stories to tell that would make it harder to ignore this issue, that would be great.

And of course, my challenge to the AI developers is you keep developing tools for the recruiters. Where are the tools that would enable an individual to document their experience as they're trying to get through hundreds of these processes, hundreds of companies, and turn that data into information which HR and the regulators can then use. And so that would be fantastic. So let's hope that we get some of those big brains focused on individual human beings trying to just make their way through life, get a job, buy a mortgage, feed their family. It’s that basic.

TS:

And how can people learn more and how can they follow your work?

SSP:

Well, follow me on LinkedIn, Susan Scott-Parker, and as I say, our website, disabilityethicalai.org. And get in touch if you've got examples and stories to tell, where it's working great or where it's not working, so that we can make this a human story and bring it to life.

TS:

It's been a pleasure having you on the show. Thank you so much for your time.

SSP:

Thank you very much.

TS:

Thank you to Susan Scott-Parker of Business Disability International. Thanks also to Project Nemo and Amit Ghose for giving us permission to use the interview you heard earlier. You can learn more at projectnemo.co.uk and follow Amit on TikTok. I've included the links in the show notes.

If you learned something new in this episode, please share it with someone who'd appreciate it too. And remember you can find us on LinkedIn and Instagram at madeforuspodcast. I'm Tosin Sulaiman. Thanks for joining me on Made For Us.


About the Podcast

Made For Us
Innovating for inclusion
Made For Us is an award-winning podcast for anyone who’s curious about how to design for inclusivity. Join us each week for conversations with founders, designers, product inclusion leaders and other creative minds who are challenging the status quo of how everyday products are designed. Each episode will bring you insights from people who've spent years thinking, perhaps even obsessing, about how to develop products or build companies that are inclusive from the start.

AWARDS

2024 Signal Awards:

Bronze winner: Most Inspirational Podcast

2024 International Women's Podcast Awards:

Finalist: Moment of Insight from a Role Model for 'Reflections on creating the headscarf emoji, with Rayouf Alhumedhi'

Finalist: Moment of Visionary Leadership for 'No going back': lessons from P&G's product inclusion journey, with Sam Latif'