Does Artificial Intelligence Impact the Hiring of People with Disabilities?

People with disabilities face significant disadvantages in the workforce. According to the U.S. Equal Employment Opportunity Commission (EEOC), among all employment discrimination cases filed in 2019, the most common claims involved disability-based discrimination (33.4%), closely followed by race- and gender-based discrimination. Today, a new form of employment discrimination is raising concern: Artificial Intelligence (“AI”) bias.

What is Artificial Intelligence?

Artificial intelligence is a branch of computer science that develops computers and machines to imitate intelligent human behavior. General examples of AI in our daily lives might include “Siri” or “Alexa.” AI is also integrated into assistive technologies such as Seeing AI, AVA, Voiceitt, and smart wheelchairs, just to name a few.


How Is Artificial Intelligence Used in Hiring, and How Does It Impact People with Disabilities?

AI is also widely used in hiring and recruiting. According to Glassdoor, AI hiring tools are used across many industries, from “Allstate to Hilton to Five Guys Burgers.” A common example is LinkedIn, a website that connects job seekers with employers and recruiters. For job seekers, LinkedIn’s AI suggests jobs they may be interested in based on their profiles and work experience, and it suggests connections to potential employers as well. Other examples of AI hiring tools include text-searching technology that screens high volumes of job applications, facial-analysis technology that scans applicants’ facial expressions and body language during video interviews, and voice-scanning technology that evaluates a job applicant’s speech, tone, and word choices.

However, despite its convenience, AI can also be biased on the basis of race, gender, and disability status, and it can be used in ways that exacerbate systemic employment discrimination. For instance, researchers have found that assessing facial movement and voice in applications may “massively discriminate against many people with disabilities that significantly affect facial expression and voice: disabilities such as deafness, blindness, speech disorders, and surviving a stroke.” Online personality tests and web-based neuroscience games used in AI hiring tools may also screen out people with mental illnesses.

Generally, AI hiring tools are programmed to identify an employer’s preferred traits based on the employer’s existing pool of employees. This means that if disabled people are not represented in an employer’s current workforce, the AI hiring tool may learn to screen out job candidates with a disability. Essentially, AI would treat “underrepresented traits as undesired traits.” As a result, “people with disabilities—like other marginalized groups—risk being excluded,” says Alexandra Givens, president and CEO of the Center for Democracy & Technology. To overcome bias, AI hiring tools need to be trained with more diverse data that includes employees with disabilities. Currently, disabled people are underrepresented in the workforce, and unsurprisingly, technology emulates this phenomenon. “If an algorithm’s training data lacks diversity, it can entrench existing patterns of exclusion in deeply harmful ways,” Givens wrote in an article for Slate.
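To make this mechanism concrete, below is a minimal, hypothetical Python sketch, invented purely for illustration and not based on any actual vendor’s system. It trains a simple model on synthetic “past hires” data in which a trait that can correlate with disability (here, an employment gap) never appears among past hires, then shows the model penalizing an otherwise identical candidate who has that trait:

    # Toy illustration only: how a model trained on an employer's past
    # hiring outcomes can learn to penalize an underrepresented trait.
    # All feature names and data are invented for this example.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 500

    # Features: [skill_score, employment_gap_years]. In this synthetic
    # history, no past hire has an employment gap, while past
    # rejections often do.
    skill = rng.normal(0.0, 1.0, size=2 * n)
    gap = np.concatenate([np.zeros(n),                 # hired: no gaps
                          rng.uniform(0.5, 3.0, n)])   # rejected: gaps
    hired = np.concatenate([np.ones(n), np.zeros(n)])
    X = np.column_stack([skill, gap])

    model = LogisticRegression().fit(X, hired)

    # Two equally skilled candidates; only the employment gap differs.
    candidates = np.array([[1.5, 0.0],   # no gap
                           [1.5, 2.0]])  # two-year gap
    print(model.predict_proba(candidates)[:, 1])

The printed “hire” probabilities show the candidate with the gap scored far lower despite identical skill. Real hiring systems are much more complex, but the underlying failure mode is the same: the model treats the underrepresented trait as an undesired one.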


Seeking Solutions through Legal Advocacy

The ADA limits an employer’s ability to make disability-related inquiries at the recruiting stage. AI hiring tools that enable employers to gather information about an applicant’s disability and screen out qualified candidates could face liability under the ADA as well as under state and local human rights laws. According to Bloomberg, the EEOC is already investigating at least two potential claims and lawsuits involving an AI tool’s discriminatory decisions in hiring, promotion, and other workplace matters.

State and local governments are proposing and enacting laws that regulate the use of AI hiring tools and scrutinize any discriminatory effects those tools may cause. Illinois pioneered the AI Video Interview Act, which requires employers to notify job applicants of their use of AI hiring tools, explain how the tools work, and obtain the applicants’ consent. New York City is reviewing a proposed bill that would require sellers of AI hiring tools to undergo an annual “bias audit.” While we wait for lawmakers to enact laws that promote AI accountability, advocates will seek action in the courts to tackle discrimination arising from AI hiring tools.

Juyoun Han is a lawyer at Eisenberg & Baum LLP who leads the firm’s Artificial Intelligence Fairness & Data Privacy department. As a litigator, Juyoun has represented Deaf and Hard of Hearing clients in courts across the country, advocating for individuals with disabilities to be treated equally in the workplace, hospitals, law enforcement settings, and prisons.

Patrick Lin is a second-year law student at Brooklyn Law School, where he is vice president of Legal Hackers and a staff member of the Brooklyn Law Review. Prior to law school, Patrick worked on data management and regulatory compliance as a technology consultant.

  • Note: Readers should contact their attorney to obtain advice with respect to any particular legal matter. Only your individual attorney can provide assurance that the information contained herein is applicable or appropriate to your particular situation. Use of, and access to, any of the contents, links, or resources contained within this article does not create an attorney-client relationship between the reader, user, or browser and the authors or ABILITY Magazine. The views expressed through this article are those of the individual authors writing in their individual capacities only.
