U.S. Department of Labor Issues Guidance on Avoiding Discrimination When Using AI in Hiring
By: Alexandra Shulman and Leah Lively
AI in hiring: About 80% of U.S. companies, and almost all Fortune 500 companies, use AI-powered hiring software. AI may be used to target online advertising for job opportunities and to match candidates to jobs on employment platforms (e.g., LinkedIn, Indeed). AI may also be used to rank or reject applicants through automated resume screening and chatbots that apply knockout questions, keyword requirements, or specific qualifications or characteristics.
With the growing use of AI comes growing concern from the government (and argument from plaintiffs) that AI tools risk worsening workplace discrimination based on race, gender, disability, and other protected characteristics. AI tools are trained on vast amounts of data and make predictions based on patterns and correlations within that data. However, many of the tools used by employers are trained on data from the employer’s own workforce and previous hiring practices, which, it is argued, may reflect institutional and systemic biases already present in the organization.
The Department of Labor responds to AI: If your company uses (or is thinking of using) AI in hiring, you need to be aware of the U.S. Department of Labor’s (“DOL”) recently issued “AI & Inclusive Hiring Framework.” The Framework is designed to “help organizations advance their inclusive hiring policies and programs, specifically for people with disabilities, while managing the risks associated with deploying AI hiring technology.” The Framework was published by the Partnership on Employment & Accessible Technology, which is funded by the DOL’s Office of Disability Employment Policy.
The AI Framework includes ten focus areas designed to address five overarching themes:
1. Impact of procuring AI hiring technology
Employers utilizing AI hiring technology should consider its impact on their DEIA (diversity, equity, inclusion, and accessibility) initiatives.
2. Advertising employment opportunities and recruiting inclusively
Employers utilizing AI in hiring should continue to consider the rights and user experiences of job seekers with disabilities and members of other protected classes.
3. Providing reasonable accommodations to job seekers
Employers must continue to provide reasonable accommodations to applicants and employees.
4. Selecting candidates and making employment offers responsibly
Utilizing AI in hiring does not absolve employers of their responsibility to comply with applicable federal, state, and local laws in hiring.
5. Incorporating human assistance and minimizing risk
Employers should develop human oversight policies to address possible AI errors.
What this means for employers: Employers must continue to be mindful of anti-discrimination laws as they begin to integrate AI into the workplace. Employers should take the time to evaluate their AI practices and ensure that proper safeguards are in place to identify and rectify any discriminatory impact. This can be done through:
- Conducting AI audits. Employers need to know when and how AI is used in the hiring process. Employers should audit the AI tools and algorithms they use in hiring to identify potential bias or discrimination. This can be achieved by having third-party experts, like employment counsel, evaluate the data inputs and outputs (see the illustrative sketch following this list).
- Ongoing monitoring. AI hiring bias compliance cannot be a one-time effort. Companies must implement ongoing monitoring programs to regularly reassess their tools as models are updated and new data is incorporated. Feedback loops that provide detailed hiring outcomes are essential to full transparency.
- Do not forget the human factor. AI should not be allowed to make final hiring decisions autonomously, as its effectiveness depends on the quality of the data it processes. We recommend that companies equip their HR teams/hiring managers with technical knowledge of AI systems to better manage and evaluate their use.
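The Framework does not prescribe any particular statistical test, but one common screen that auditors apply when evaluating a hiring tool's outputs is the EEOC's "four-fifths rule," which compares each group's selection rate to that of the most-selected group. The sketch below, written in Python with entirely hypothetical group names and outcome data, shows what that check can look like in practice.

```python
# Illustrative sketch only: a four-fifths-rule check on hypothetical screening
# outcomes. The DOL Framework does not mandate this metric or any specific code;
# group labels and data here are invented for demonstration.
from collections import Counter

# Hypothetical outcomes: (applicant group, did the AI tool advance the applicant?)
outcomes = [
    ("Group A", True), ("Group A", True), ("Group A", False), ("Group A", True),
    ("Group B", True), ("Group B", False), ("Group B", False), ("Group B", False),
]

applied = Counter(group for group, _ in outcomes)
advanced = Counter(group for group, passed in outcomes if passed)

# Selection rate = share of each group's applicants that the tool advanced.
rates = {group: advanced[group] / applied[group] for group in applied}
benchmark = max(rates.values())

for group, rate in rates.items():
    ratio = rate / benchmark
    flag = "review for potential adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {ratio:.2f} -> {flag}")
```

An impact ratio below 0.8 is not itself a legal conclusion; it is a signal that the tool's outcomes warrant closer review with counsel.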
If you have questions about using AI in the employment process, or would like additional information about compliance with the DOL’s new Framework, we are here to help.
This communication is not intended to create or constitute, nor does it create or constitute, an attorney-client or any other legal relationship. No statement in this communication constitutes legal advice nor should any communication herein be construed, relied upon, or interpreted as legal advice. This communication is for general information purposes only regarding recent legal developments of interest, and is not a substitute for legal counsel on any subject matter. No reader should act or refrain from acting on the basis of any information included herein without seeking appropriate legal advice on the particular facts and circumstances affecting that reader. For more information, visit www.buchalter.com.