Top 6 Takeaways to Reduce AI Hiring Bias

July 4, 2023 | Jenni Gray

Artificial intelligence (AI) has worked its way into many facets of our lives, and talent acquisition/management is no exception.

As of 2022, four of every five employers use AI-enabled and automated tools in their key employment activities, including recruitment and hiring. Nearly 40% use AI-enabled tools for performance management, according to a SHRM study.

However, the use of AI can sometimes result in a disparate impact: an unintended outcome that disproportionately harms a group of people, particularly protected classes defined by race, ethnicity, sex, disability, and other protected characteristics. For example, a screening tool that selects a markedly lower share of applicants from one protected group than from others has a disparate impact, even if no one intended it.

We hosted a roundtable to discuss the current regulatory landscape regarding the use of AI in human resources, how to assess potential bias in your hiring process with an AI audit, and how Cisive does or doesn’t use AI currently.

Here are our top six takeaways:

 

1. Regulations surrounding the use of AI are already here in Europe.

The European Parliament has already passed the AI Act. It’s not law yet, but when it is, it will treat AI tools used for employment as high risk. “You’re allowed to use it, but those systems have to be registered in an EU database and they have to be assessed both before use and throughout their life cycle,” said Margo Pave, Director of the Human Capital Strategy Group at Resolution Economics, LLC.

 

2. In the United States, federal regulations are coming soon.

In the U.S., the White House introduced a blueprint for an “AI Bill of Rights” in late 2022 to “protect individuals and communities from algorithmic discrimination and to use and design systems in an equitable way.” There is no federal legislation yet, but there likely will be soon. In the meantime, the Equal Employment Opportunity Commission (EEOC) has taken a firm stance that existing anti-discrimination laws still apply, regardless of the technologies employers use.

 

3. State and local governments in the U.S. are already enacting legislation.

As of July 5th, New York City is the first U.S. jurisdiction to enforce a law specific to “Automated Employment Decision Tools” (AEDTs). The law requires an annual bias audit of each AEDT, conducted by an independent auditor, with the results posted publicly. Employers must also notify applicants that an AEDT is used, including which job qualifications and characteristics it assesses, the type and source of data collected, and the AEDT data retention policy. Illinois and Maryland have also passed laws governing the use of AI in video interviews.

 

4. AI bias audits help you stay compliant.

A bias audit is a review of your AI tools, conducted by an independent auditor, to determine whether a tool is producing biased outcomes in your hiring process. As federal, state, and local governments pass legislation on the use of AI in hiring, preparing for an AI bias audit now will set you up for success later. (Check out our presentation at the 34:30-minute mark to learn more about how bias audits work.)
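To make the audit concrete, here is a minimal sketch of the selection-rate and impact-ratio calculation that sits at the heart of most bias audits, including the kind New York City’s AEDT rules describe. The group labels, counts, and the 0.8 threshold (the EEOC’s “four-fifths” rule of thumb) are illustrative assumptions, not a prescribed methodology.

```python
# Illustrative sketch: selection rates and impact ratios by group.
# Group names, counts, and the 0.8 cutoff are hypothetical examples.

from collections import defaultdict

def impact_ratios(candidates):
    """candidates: list of (group, selected) tuples, e.g. ("group_a", True)."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, was_selected in candidates:
        totals[group] += 1
        if was_selected:
            selected[group] += 1

    # Selection rate = share of each group that the tool selected.
    rates = {g: selected[g] / totals[g] for g in totals}
    best = max(rates.values())

    # Impact ratio = each group's rate relative to the highest rate.
    return {g: rate / best for g, rate in rates.items()}, rates

if __name__ == "__main__":
    sample = ([("group_a", True)] * 60 + [("group_a", False)] * 40
              + [("group_b", True)] * 40 + [("group_b", False)] * 60)
    ratios, rates = impact_ratios(sample)
    for group, ratio in ratios.items():
        flag = "review" if ratio < 0.8 else "ok"
        print(f"{group}: selection rate {rates[group]:.0%}, impact ratio {ratio:.2f} ({flag})")
```

An impact ratio well below 1.0 for a protected group is a signal to investigate the tool and the data behind it, not proof of discrimination on its own.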

 

5. It’s about finding balance.

“Background screening is not meant to be an automated employment decision tool,” said Alan Gordon, Chief Information Officer at Cisive. “There’s nothing about the way we implement background screening at Cisive that makes the decision process automated.” Cisive believes in using AI as an assist to humans, and that candidates should receive an individualized assessment rather than an automated decision.

 

6. Background screeners will be closely evaluated.

As these new AI laws are enforced, consumer reporting agencies like Cisive will be scrutinized to determine which automated tools they use and whether those tools contribute to disparate impact. Cisive watches legislation closely to stay compliant and has never been the subject of an enforcement action or class-action lawsuit, which minimizes risk for our clients.

Cisive works to be as efficient as possible while remaining fair to candidates in the hiring process. AI tools such as image processing that validates information on uploaded documents (W-2s, diplomas, etc.) save our investigators and our clients a lot of time, while a human investigator still writes up the education or employment verification. Our platform also requires a human decision by the client on whether to take adverse action against a candidate.
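As a generic illustration of that human-in-the-loop pattern (a hypothetical sketch, not Cisive’s actual system), the key design choice is that automated checks can only flag, never decide:

```python
# Generic human-in-the-loop sketch (illustrative only, not Cisive's code):
# automated document checks can flag an item for review, but only an
# explicit human decision can result in adverse action.

from dataclasses import dataclass
from typing import Optional

@dataclass
class DocumentCheck:
    document_id: str
    automated_flag: bool              # e.g., image validation raised a concern
    reviewer_decision: Optional[str]  # "clear", "adverse_action", or None if unreviewed

def can_take_adverse_action(check: DocumentCheck) -> bool:
    # The automated flag alone is never sufficient; a human must decide.
    return check.reviewer_decision == "adverse_action"

flagged = DocumentCheck("doc-123", automated_flag=True, reviewer_decision=None)
assert not can_take_adverse_action(flagged)  # automation alone cannot trigger adverse action
```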

Stay alert as new regulations on the use of AI in hiring continue to move through legislatures. An AI bias audit gives you clear evidence of whether your hiring process is producing disparate impact, and where to correct it.

Want to learn more? Watch the complete presentation here.

 
