
Implications for Tech Staffing

Nationwide Implications for New York City’s AI Bias Audit Law

Can machine learning, statistical modeling, and data analytics combine to fairly screen job applicants?

by Henry Lenard 


The law is very vague as to what the bias audit is supposed to examine, and how it is expected to account for all potential jobs and job classes for which a company might hire in the upcoming year.

—Jim Paretti, Littler’s Workplace Policy Institute



Is This the Beginning of "Womb-to-Tomb" Algorithms?

Automated Employment Decision Tools

New York City’s first-of-its-kind law requiring employers to conduct annual independent bias audits of automated employment decision tools (AEDTs) used in hiring and employee promotion is expected to be disruptive and to reverberate nationwide.

An AEDT uses machine learning, statistical modeling, data analytics or artificial intelligence to screen job applicants. It generates a simplified output such as a score, classification or recommendation to assist or replace discretionary decision-making. The bias audit is intended to analyze the AEDT for possible discrimination against candidates by race or gender.

“This is a very complicated issue. Algorithms have an inherent bias,” said James A. Paretti, Jr., a shareholder in Littler’s Washington, D.C. office who has been closely following this issue. Littler is the world’s largest employment and labor law practice representing management.

The law, passed by the New York City Council in November, takes effect on January 1, 2023. It bars employers and employment agencies from using AEDTs to screen job applicants or candidates for promotion unless those tools have been independently audited for bias no more than one year before their use. Thus, employers wishing to continue or begin using AEDTs next January should already be conducting their bias audits.

A summary of that audit, along with the AEDT’s distribution date, must be made publicly available on the company’s or employment agency’s website.

According to Paretti, the law is very vague as to what the bias audit is supposed to examine, and how it is expected to account for all potential jobs and job classes for which a company might hire in the upcoming year. It also does not explain how the audit’s findings are to be used, whether the AEDT must pass the audit and, if so, what the passing criteria are.

In addition, the law requires that employers or employment agencies provide notice to job candidates no less than 10 business days before using an AEDT that such a tool will be deployed for reviewing their application.

“The 10-day audit bias notice requirement impacts contingent workforces and staffing firms. The job could already be filled in that time,” said Paretti.

Employers and agencies must also specify what job qualifications and characteristics will be used. The individual has the right to request an alternative selection process, such as a personal interview. The law establishes civil penalties of $500 to $1,500 for each violation of any of these requirements.

The local law only applies to applicants or employees who are residents of New York City but is expected to be a model for similar legislation across the country because of the city’s size.

With a population of 8.8 million, New York City is larger than 39 of America’s 50 states. The city is headquarters for 54 companies on the Fortune 500 list of America’s largest companies. According to the New York State Department of Labor, the city now employs a record 4.6 million workers, adding approximately 90,000 jobs a year.

For a global comparison, the Tokyo Bureau of Industry & Labor reports its workforce numbers 7.9 million people, while Eurostat shows 5 million workers in the Paris region and 4.9 million workers in Greater London.

While employers need to be aware of this new law, it is raising many questions.

“The New York law doesn’t do a particularly good job in defining terms of AI algorithmic screening tools. It is a somewhat muddled definition,” said Paretti.

The law requires that applicants be notified of the specific job qualifications and characteristics the AEDT will use in making its assessment. This aspect is troublesome, said Paretti, because the underlying AI is designed to be constantly evolving.

“If you take a paper test, it will always be the same test with the same right and wrong answers. With an AI algorithm, the test will be changing constantly. It could be an entirely different test,” said Paretti in explaining the challenges of auditing an AEDT test.

What are the implications for hiring technical staff?

Paretti said the complexity of the continuous learning process of AEDTs means the programs typically do not retain a snapshot or other record of each step in their evolution. Their programmers can’t fill the gap because these tools are designed to evolve in a self-determined manner, without human intervention.

He added that he anticipates challenges being filed when the new regulations are implemented. 

Employer use of AEDTs to sort through myriad job applications has been soaring. A recent Global Talent Trends survey of more than 7,300 human resource managers worldwide by asset management firm Mercer found that the use of predictive analytics in hiring rose from 10 percent in 2016 to 39 percent by 2020.

Researchers have found that, like other AI applications, such hiring tools often have inherent bias, favoring men or screening out individuals from certain socioeconomic backgrounds.

Amazon, for example, abandoned its AI recruitment software several years ago after it was determined to be biased against women by not rating candidates for software developer and other technical positions in a gender-neutral way.

The company’s computer models were programmed to vet applicants by observing patterns in resumes submitted over a 10-year period. Most of the resumes were from men, reflecting the male dominance in the industry. Thus, the system taught itself that male candidates were preferable, downgrading resumes that included the word “women”, reportedly going so far as to penalize graduates of all-women’s colleges.

A well-known 2020 study from the MIT Sloan School of Management found that how a hiring algorithm is designed impacts the quality and diversity of candidates.

As explained by MIT Sloan Professor Danielle Li, typical hiring algorithms are designed to solve a static prediction problem. They look at a historical data set of people who were previously selected to predict who will be a good hire from a pool of current applicants.

“As a result, those algorithms often end up providing a leg-up to people from groups who have traditionally been successful and grant fewer opportunities to minorities and women,” said Li.
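The "static prediction" dynamic Li describes can be illustrated with a toy model: a tool that learns from past selection decisions simply reproduces historical selection rates. This is a minimal, hypothetical sketch; the group labels and hiring history below are invented for illustration and are not drawn from any real dataset.

```python
from collections import defaultdict

def fit_selection_rates(history):
    """Learn per-group selection rates from historical hiring decisions.

    history: list of (group, was_hired) pairs, where was_hired is 0 or 1.
    Returns a dict mapping each group to its observed selection rate.
    """
    hired = defaultdict(int)
    seen = defaultdict(int)
    for group, was_hired in history:
        seen[group] += 1
        hired[group] += was_hired
    return {g: hired[g] / seen[g] for g in seen}

# Hypothetical history: group A applicants were hired far more often.
history = ([("A", 1)] * 60 + [("A", 0)] * 40 +
           [("B", 1)] * 20 + [("B", 0)] * 80)
rates = fit_selection_rates(history)
# A static predictor built on this history scores new group-A applicants
# higher purely because group A succeeded more often in the past,
# regardless of any individual applicant's qualifications.
```

The point of the sketch is that nothing in the "model" measures merit; it only echoes who was selected before, which is exactly how historical imbalance gets carried forward.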


In anticipation of the New York City law, several AEDT vendors have commissioned third-party audits of their tools. In 2020, New York City-based Pymetrics, one of the largest such AEDT companies, retained Northeastern University to audit its hiring algorithm. Pymetrics paid the school nearly $105,000 to have a team of computer scientists review its AEDT for compliance with the proposed New York City law.

Northeastern based its definition of fairness on the so-called “four-fifths rule” of the U.S. Equal Employment Opportunity Commission (EEOC), which holds that hiring procedures should select roughly the same proportion of men and women. Under the rule, if men pass to the next step in the hiring process 100 percent of the time, women must advance at least 80 percent of the time.
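The four-fifths rule reduces to simple arithmetic: divide one group’s selection rate by the other’s and compare the ratio to 0.8. The sketch below shows the calculation; the pass counts are illustrative numbers, not figures from the Northeastern audit.

```python
def impact_ratio(selected_a, total_a, selected_b, total_b):
    """Ratio of group A's selection rate to group B's selection rate."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return rate_a / rate_b

def passes_four_fifths(ratio, threshold=0.8):
    """EEOC guidance treats a ratio below 4/5 (0.8) as evidence of
    adverse impact against the lower-rated group."""
    return ratio >= threshold

# Illustrative example: 45 of 100 women and 50 of 100 men advance.
ratio = impact_ratio(45, 100, 50, 100)   # 0.45 / 0.50 = 0.9
print(round(ratio, 2), passes_four_fifths(ratio))  # 0.9 True
```

With a 0.9 ratio the tool clears the threshold; had only 35 of 100 women advanced, the ratio would fall to 0.7 and fail.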

The Northeastern auditors found Pymetrics’s system met the four-fifths rule.  

Pymetrics is not the only AEDT vendor having its AI audited. Utah-based HireVue hired O’Neil Risk Consulting and Algorithmic Auditing (ORCAA) to evaluate one of its algorithms. Rather than evaluating the AEDT’s technical design, ORCAA interviewed job applicants, an AI ethicist and several non-profit organizations about potential problems with the HireVue tool and gave the company improvement recommendations. The final report is published on HireVue’s website.

In December, several of the world’s largest employers formed the Data & Trust Alliance to advance algorithmic safety built on responsible data and AI practices. The group was put together by former chief executives of American Express and IBM. Its two dozen members include such companies as General Motors, Walmart, CVS Health, Meta, Deloitte and Mastercard.

The Data & Trust Alliance’s first initiative will be adopting criteria to mitigate data and algorithmic bias in human resources and workforce decisions, including recruiting, compensation and employee development.

The group has developed a 55-point questionnaire that attempts to identify discrimination in resume-screening and other algorithms companies are using in employment. That is a major concern because of legal exposure if bias is determined to exist.

Paretti is supportive of that effort. “An industry standard is what is necessary. Industry has a better understanding of AI algorithms than government,” he said.

While more employers are turning to AEDTs, so are lawmakers and regulators. General artificial intelligence bills or resolutions were introduced in at least 17 states in 2021.

In the past two years, laws have been enacted in Illinois and Maryland regulating how employers can use applicant data generated through AEDTs.  

In Illinois, for example, the Artificial Intelligence Video Interview Act, which regulates employers’ use of algorithms to analyze video interviews, took effect on January 1, 2020.

This past October, the EEOC launched an initiative to ensure that AI and other emerging tools used in hiring and other employment decisions comply with federal civil rights laws that the agency enforces.

The new initiative will build on the EEOC’s previous work in this area. It has been examining the issue of AI, people analytics, and big data in hiring and other employment decisions since at least 2016.

“This whole issue is moving fairly quickly. Municipalities tend to pass such laws first. States move a little more slowly,” Paretti said. “As a practical matter, the EEOC doesn’t move that fast. States might want to move faster than the EEOC.”

Paretti cautioned, however, that legislation needs to be done correctly. “It is important to educate yourself on the issue and understand it before adopting anything,” he said.

While the New York City law is an attempt to address concerns from civil rights groups that the use of AEDTs in hiring decisions may result in discrimination against women and minorities, some believe the law does not go far enough.

New York City-based Surveillance Technology Oversight Project (S.T.O.P.) said the new law on algorithmic bias will “rubber-stamp discrimination” by enabling more biased AI software.

“New York should be outlawing biased tech, not supporting it. Rather than blocking discrimination, this weak measure will encourage more companies to use biased and invasive AI tools,” said Albert Fox Cahn, executive director of S.T.O.P.

Likewise, the Washington, D.C.-based Center for Democracy & Technology (CDT) issued a statement calling the law “deeply flawed”, saying it was a watered-down version of what was originally introduced and focuses only on race and gender while ignoring other factors such as disability and age discrimination.

CDT also faulted the law for focusing on the hiring process and ignoring the use of AEDTs in determining compensation and scheduling. It noted that the final version which became law only applies to workers who are residents of New York City, rather than to all employees of New York City-based employers.

CDT said the original draft of the New York City audit bias law was already the most significant proposed legislation on algorithmic, data-driven, or AI-powered human resources tools in the U.S. to date. It said the final and weaker version which has become law might be viewed by other jurisdictions as a potential model for regulating AEDTs. “That would be a serious mistake,” said CDT.