Remedies Don’t Solve Problems: Regulating AI in Hiring

NEWSLETTER VOLUME 1.17 | August 31, 2023

Editor's Note


I am generally in favor of making hiring processes fair and auditing the actual hiring practices to see if they result in bias. 

 

I'm still having trouble with the ways legislatures are approaching the issue. They can't outlaw the use of AI in employment decisions completely. So, instead of understanding what the technology does, how it works, and its role in recruiting and hiring processes, lawmakers do law stuff. Traditional legal tools include: 

 

  • requiring notice and transparency (e.g., background checks, pay disclosures, wage and hour posters)  
  • limiting types of information that employers can rely on (e.g., salary history question bans and many interview topics like whether someone is married or has kids)   
  • enacting penalties for violations and awarding attorneys' fees if employees prevail. 

 

None of these tools address the realities that: 

 

  • employers have way more power in the hiring process than applicants 
  • the burden is always on the applicant to object and risk rejection or bring a lawsuit when all they really wanted was a job 
  • humans are just as biased as AI; human bias is the reason AI is biased 
  • penalties do not deter discrimination, particularly when the bias is unconscious. 

 

Remedies don't solve problems. 

 

Some of the laws require audits, record keeping, and reporting. This is a much better approach because it creates evidence, forces employers to look at the outcomes of their hiring processes, and requires disclosure to agencies that can do something about it. But even that takes time and resources many states simply don't have. 

 

The designers and sellers of these tools offer easy auditing, alerts about changes in workforce demographics, benchmark data, and the ability to see where problems are and find out what's happening. These tools and data allow employers to find and address issues before they become claims and lawsuits. 

 

Let's require employers to use their data and tools to actually address the problem. Do that and I promise the tools and data will get better.  

 

Give employers incentives to audit and immunity for finding and fixing problems. Let's stop putting enforcement of remedies on applicants and employees. We have the technology. Let's use it for good. 

 

Meanwhile, this is a great summary of current and proposed legislation regulating AI in employment decisions. 

 

- Heather Bussing

 

States’ Increased Policing of Artificial Intelligence in the Workplace Serves as Important Reminder to Employers

by Katherine Oblak and Yasamin Parsafar

at Sheppard Mullin Richter & Hampton LLP

Employers’ burgeoning use of and reliance upon artificial intelligence has paved the way for an increasing number of states to implement legislation governing its use in employment decisions. Illinois enacted first-of-its-kind legislation regulating the use of artificial intelligence in 2020, and as previously discussed, New York City recently enacted its own law. In 2023 alone, Massachusetts, Vermont and Washington, D.C. also have proposed legislation on this topic. These legislative guardrails are emblematic of our collective growing use of artificial intelligence, underscore the importance of understanding the legal issues this proliferating technology implicates, and highlight the need to keep abreast of the rapidly evolving legislative landscape. Below is a high-level summary of AI-related state legislation and proposals of which employers should be aware. 

Illinois 

In 2020, Illinois Gov. J. B. Pritzker signed into law the Artificial Intelligence Video Interview Act (the “Act”). As previously reported, the Act requires, among other things, employers who use artificial intelligence to analyze video interviews to do the following: 

  • Provide notice: Before an interview, employers must inform applicants that artificial intelligence may be used to analyze the applicant’s video interview and consider the applicant’s fitness for the position. 
  • Provide an explanation: Before an interview, employers must explain to the applicant how their artificial intelligence program works and what characteristics the technology uses to evaluate an applicant’s fitness for the position. 
  • Obtain consent: Before an interview, employers must obtain the applicant’s consent to have the artificial intelligence evaluate them. Employers may not use artificial intelligence to evaluate a video interview without consent. 
  • Maintain confidentiality: Employers will be permitted to share the videos only with persons whose expertise or technology is needed to evaluate the applicant’s fitness for the position. 
  • Destroy copies: Upon the applicant’s request, employers must destroy both the video and all copies thereof within 30 days after such request (and instruct any other persons who have copies of the video to destroy their copies as well). 

The Act leaves many issues unresolved. For instance, the Act itself does not define “artificial intelligence”. Indeed, even with the proliferation of this technology, there is no settled legal definition of the term. The Act similarly is silent as to what kind and level of information is sufficient to meet the statute’s “explanation” requirement. Nor does the Act specify to which employers it applies and to whom it affords protections — or even if there is a private right of action.  

While many of the Act’s nuances remain unsettled, the Illinois legislature has not shied away from tightening the reins on employers’ use of the technology and its myriad capabilities. Illinois passed legislation, effective January 1, 2022, imposing robust reporting requirements on those employers who rely solely on artificial intelligence analysis of video interviews to determine whether to select an applicant for an in-person interview. Under the amendments, such employers must collect and report: 

  • the race and ethnicity of applicants whom the employer does not provide with the opportunity for an in-person interview after the use of artificial intelligence analysis, and 
  • the race and ethnicity of applicants whom the employer hires. 

Employers must report this demographic data to the Department of Commerce and Economic Opportunity annually by December 31 of each calendar year, with the report to include the data collected in the 12-month period ending on the preceding November 30. The Department will then analyze the data reported and report to the Governor and General Assembly by July 1 of each year whether the data discloses a racial bias in the use of artificial intelligence. 

New York 

More recently, effective July 5, 2023, New York City enacted a law even more robust than its Illinois counterpart. The New York City Automated Employment Decision Tools Law (“AEDTL”) prohibits employers and employment agencies from using automated employment decision tools unless: (a) the tool has been subjected to a bias audit within a year of its use or implementation; (b) information about the bias audit is publicly available; and (c) employers provide certain written notices to employees or job candidates. 
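The AEDTL itself leaves the audit arithmetic to agency rulemaking, but bias audits of this kind generally compare selection rates across demographic categories and flag groups whose rate falls well below the most-selected group's. A minimal sketch, using hypothetical group names and counts; the 0.8 "four-fifths" threshold comes from EEOC guidance on adverse impact, not from the AEDTL text:

```python
# Hypothetical bias-audit sketch: selection rates and impact ratios
# per demographic category. Group names and counts are invented.

def impact_ratios(selected, total):
    """For each group, return (selection rate, impact ratio), where the
    impact ratio is the group's rate divided by the highest group's rate."""
    rates = {g: selected[g] / total[g] for g in total}
    best = max(rates.values())
    return {g: (rates[g], rates[g] / best) for g in rates}

selected = {"Group A": 40, "Group B": 24}   # candidates advanced by the tool
total = {"Group A": 100, "Group B": 100}    # candidates assessed by the tool

for group, (rate, ratio) in impact_ratios(selected, total).items():
    # Four-fifths rule of thumb (EEOC guideline): ratios under 0.8 warrant review.
    flag = "review" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, impact ratio {ratio:.2f} ({flag})")
```

With these invented numbers, Group B's impact ratio is 0.24 / 0.40 = 0.6, below the four-fifths guideline, which is exactly the kind of disparity a published bias audit is meant to surface before the tool is used.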

Other Proposed Legislation 

The Illinois and New York laws are part of a growing trend to regulate the use of artificial intelligence in the workplace. Massachusetts, Vermont and Washington, D.C. are following suit and seek to impose their own safeguards. 

Massachusetts 

The Massachusetts Act Preventing a Dystopian Work Environment (H1873), introduced February 16, 2023, requires employers to provide notice to workers prior to adopting an automated decision system (ADS). The Act defines an ADS as “a computational process, including one derived from machine learning, statistics, or other data processing or artificial intelligence techniques, that makes or assists an employment-related decision.” 

Current status: The bill remains pending in the House before the Joint Committee on Labor and Workforce Development. 

Washington, D.C. 

Similarly, the D.C. Stop Discrimination by Algorithms Act of 2023 (B114), introduced February 2, 2023, prohibits businesses from using algorithms to make determinations about “important life opportunities,” including “opportunities to secure employment,” when the algorithm is based on protected characteristics such as race, color, religion, national origin, sex, or disability. 

Current status: On February 10, 2023, the Council published Notice of an Intent to Act in the District of Columbia Register. 

Vermont 

Vermont, too, has proposed similar legislation. Vermont H114, introduced January 25, 2023, restricts the use of automated decision systems for employment-related decisions. The legislation defines ADS as “an algorithm or computational process that is used to make or assist in making employment-related decisions, judgments, or conclusions” and specifically includes artificial intelligence. 

Current status: The House has referred the bill to the Committee on General and Housing, where it remains pending. 

Federal Guidance 

The EEOC, too, has chimed in on the topic. As we previously reported, in May 2023 the EEOC issued guidance on The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees. Among other things, this guidance defines key terms and explains how the use of algorithmic decision-making tools may violate the Americans with Disabilities Act. Significantly, the guidance makes clear an employer cannot insulate itself from liability arising from its use of artificial intelligence by utilizing a third-party vendor to develop and/or administer the tool. 

Takeaways 

The proliferation of guidance and legislation governing employers’ use of artificial intelligence underscores the need for employers to be cognizant of all pertinent laws and remain vigilant if they utilize the technology’s capabilities. 
