De-Terminator: NYC’s AI Disclosure Hiring Law and the Struggle for Corporate Compliance
By: Nidia Mendoza*
“After careful review by our AI hiring committee, unfortunately your application didn’t quite make the cut.” Seems a little unfair, doesn’t it? It would be even more unfair if you were rejected by AI without any notice that AI was used. With the increasing integration of AI into everyday activities in education, the workplace, and beyond, New York is among the few states taking steps to regulate its use.[1] In 2021, New York City enacted an AI-based hiring law, Local Law 144 of 2021 (Local Law 144), which is enforced by the Department of Consumer and Worker Protection (DCWP) and took effect on July 25, 2023.[2]
Local Law 144 requires New York City employers and employment agencies using an automated employment decision tool (AEDT) to conduct annual bias audits and provide notice of the use of AEDTs to their employees and applicants.[3] AEDTs use machine learning, statistical modeling, data analysis, or artificial intelligence (AI) “to substantially assist or replace discretionary decision-making for employment decisions that impact” applicants and employees.[4]
Employers and employment agencies must comply with the regulations if their job locations either (1) include New York City, at least part time, or (2) are fully remote but linked to an office based in New York City.[5] Failure to comply will result in civil penalties ranging from $500 to $1,500 for each violation.[6]
Before an employer may use an AEDT, it must conduct a bias audit, which involves an impartial, independent auditor performing “calculations of selection or scoring rates and the impact ratio across sex categories, race/ethnicity categories, and intersectional categories.”[7] The results of an employer’s bias audit must be published and include the date of the most recent bias audit, the source and explanation of the data used, the number of individuals falling within an unknown category, the number of applicants or candidates, the selection or scoring rates, and the impact ratios for all categories.[8]
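To make the audited quantities concrete, the sketch below computes selection rates and impact ratios in the manner the DCWP guidance describes, dividing each category’s selection rate by the rate of the most selected category. It is a minimal illustration under that assumption, not any auditor’s actual methodology; the category labels, counts, and function names are hypothetical.

```python
# Minimal sketch of the selection-rate and impact-ratio calculations a bias
# audit involves, assuming the DCWP approach of dividing each category's
# selection rate by the selection rate of the most selected category.
# All category labels and counts below are hypothetical.

def selection_rate(selected: int, applicants: int) -> float:
    """Share of applicants in a category who were selected (e.g., advanced to interview)."""
    return selected / applicants

def impact_ratios(counts: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Map each category to its impact ratio: its selection rate divided by
    the highest selection rate among all categories."""
    rates = {cat: selection_rate(sel, total) for cat, (sel, total) in counts.items()}
    highest = max(rates.values())
    return {cat: rate / highest for cat, rate in rates.items()}

if __name__ == "__main__":
    # (selected, total applicants) per category -- illustrative numbers only
    counts = {
        "Male": (120, 400),
        "Female": (90, 400),
    }
    for cat, ratio in impact_ratios(counts).items():
        rate = selection_rate(*counts[cat])
        print(f"{cat}: selection rate {rate:.2f}, impact ratio {ratio:.2f}")
```

In this hypothetical, men are the most selected category, so their impact ratio is 1.0 and the ratio for women falls below it; the published audit would report these rates and ratios for each sex, race/ethnicity, and intersectional category.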
AI models cannot fully avoid bias, as they are trained on human-generated internet data that may embed racial discrimination and gender stereotypes.[9] AI is thus susceptible to producing biased outcomes; four prevalent types of bias found in AI programs are reporting bias, selection bias, group attribution bias, and implicit bias.[10] Notably, hiring algorithms have been observed favoring men over women and associating certain discriminatory characteristics with particular jobs.[11] Despite attempts to eliminate bias-inducing characteristics, these algorithms have even inferred individuals’ race from their home addresses.[12] It is imperative that AI-based hiring systems be screened for bias and trained to mitigate discriminatory patterns in order to foster fair and equitable decision-making in the hiring process.
However, there are currently no specific constraints on the data used for bias audits. Employers retain discretion in choosing the data, so long as they disclose and explain its use and any limitations.[13] This flexibility may lead to varied and inconsistent approaches to bias audits, raising concerns about the comparability of results across employers. If New York City wants to ensure transparency, fairness, and accountability in the use of AEDTs, the lack of standardized criteria for data selection must be resolved.
If, after conducting a bias audit, an employer decides to proceed with its AEDT to evaluate applicants, it must put applicants on notice. Employers must also disclose the job qualifications or characteristics the AEDT will evaluate.[14] Such notice must be given at least 10 business days before the AEDT is used to screen applicants or employees.[15]
Despite these broad disclosure requirements, many employers have yet to comply. A study at Cornell University revealed a significant gap in adherence: of 400 employers examined, only 18 had conducted bias audits and only 11 had posted transparency notices.[16] This lack of compliance may stem from the vague requirements implemented by the DCWP. Moreover, the only apparent means of detecting violations of Local Law 144 is the submission of a complaint.[17] This raises a critical question: How can individuals ascertain whether a company is violating the law when the responsibility lies with employers to disclose their use of AI in the hiring process?
When and how more stringent requirements for bias audits and notifications will be implemented remains uncertain, but their arrival is inevitable. Employers can publish bias audit reports on their websites and make notices available in the employment sections of those sites. In the interim, it is advisable to prioritize compliance over risking potential fines.
* J.D. Candidate, Class of 2025, Sandra Day O’Connor College of Law at Arizona State University.
[1] Benjamin Lerude, States Take the Lead on Regulating Artificial Intelligence, Brennan Ctr. for Just. (Nov. 6, 2023), https://www.brennancenter.org/our-work/research-reports/states-take-lead-regulating-artificial-intelligence#:~:text=Though%20most%20states%20have%20yet,government%2Dorganized%20entities%20to%20increase.
[2] DCWP, Automated Employment Decision Tools: Frequently Asked Questions, NYC Consumer and Worker Prot. (June 6, 2023), https://www.nyc.gov/assets/dca/downloads/pdf/about/DCWP-AEDT-FAQ.pdf.
[3] Id.
[4] N.Y.C. Admin. Code § 20-870.
[5] DCWP, supra note 2.
[6] N.Y.C. Admin. Code § 20-872.
[7] DCWP, supra note 2.
[8] Id.
[9] Will Douglas Heaven, These Six Questions Will Dictate the Future of Generative AI, MIT Tech. Rev. (Dec. 19, 2023), https://www.technologyreview.com/2023/12/19/1084505/generative-ai-artificial-intelligence-bias-jobs-copyright-misinformation/.
[10] Google for Developers, Fairness: Types of Bias (July 18, 2022), https://developers.google.com/machine-learning/crash-course/fairness/types-of-bias#:~:text=A%20common%20form%20of%20implicit,affirm%20preexisting%20beliefs%20and%20hypotheses.
[11] Id.
[12] Id.
[13] DCWP, supra note 2.
[14] Id.
[15] Id.
[16] Lucas Wright, New Analysis Finds Few Companies Following NYC AI Hiring Law, Cornell Univ. Media Rels. Off. (Jan. 22, 2024), https://news.cornell.edu/media-relations/tip-sheets/new-analysis-finds-few-companies-following-nyc-ai-hiring-law.
[17] DCWP, supra note 2.