Numerous Outstanding Questions Remain About New York City’s Law Regulating the Use of Artificial Intelligence in Employment Decisions
Who is and is not covered by the law?
Public commentary has highlighted confusion about one of the more glaring issues with the law: Which employers are subject to its regulations? This confusion is sown in part by references to both city residents and employers. For instance, the law states that "the term 'employment decision' means to screen candidates for employment or employees for promotion within the city,"i suggesting that it applies to employers with operations in the city. In the Notices section, however, the law states that it applies to a job candidate who "resides in the city,"ii suggesting that coverage may instead turn on the candidate's residence.
What selection practices are, or might be, considered automated employment decision tools?
Also related to scope, the law's definition of an automated employment decision tool is quite broad and may encompass a much wider range of tools than intended:
“The term ‘automated employment decision tool’ means any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons”iii
This clearly covers tools that use artificial intelligence methods; however, it may also extend to traditional selection tools (e.g., online multiple-choice tests, automatically scored application blanks) where statistical modeling or data analytics are used to configure the tool for a particular employer or job. Further, many selection tools are formulaically scored and are used to assist in deciding which candidates progress in the selection process. Such tools have operated this way for decades, and it is unclear whether the law was intended to regulate them in addition to tools that leverage more recent, sophisticated data-science approaches.
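To make the ambiguity concrete, the sketch below shows the kind of formulaic scoring that traditional screening tools have used for decades: a fixed set of weighted items produces a score and an advance/no-advance recommendation. The items, weights, and cutoff are invented for illustration and are not drawn from the law or from any real tool; the open interpretive question is whether this sort of "simplified output" used to assist screening decisions falls within the definition quoted above.

```python
# Illustrative only: a hypothetical, decades-old style of formulaic screening.
# The items, weights, and cutoff are invented for this sketch; they are not
# drawn from the statute or from any real employer's tool.

# Fixed item weights, e.g., set once based on a statistical job analysis.
ITEM_WEIGHTS = {"years_experience": 2.0, "license_held": 5.0, "test_score": 0.5}
CUTOFF = 20.0  # candidates scoring at or above this value advance


def screen_candidate(responses: dict) -> dict:
    """Produce a simplified output (a score and a recommendation) from application data."""
    score = sum(ITEM_WEIGHTS[item] * float(responses.get(item, 0))
                for item in ITEM_WEIGHTS)
    return {"score": score, "advance": score >= CUTOFF}


if __name__ == "__main__":
    candidate = {"years_experience": 4, "license_held": 1, "test_score": 22}
    print(screen_candidate(candidate))  # {'score': 24.0, 'advance': True}
```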
As reported in a previous Law360 article, the NYC AEDT law, one of the broadest in this area passed to date, will require employers using selection tools that leverage artificial intelligence, machine learning, natural language processing, or other automated algorithms to: (1) conduct an independent bias audit of each AEDT; (2) make the audit results publicly available on their websites; and (3) notify candidates (a) that the selection process includes an AEDT, (b) what job qualifications and characteristics the AEDT evaluates, and (c) that they may request an alternative selection process or an accommodation. Employers that fail to meet the audit or notice requirements are subject to fines: $500 for the first violation and $500–$1,500 for each subsequent violation.
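For a sense of what a bias audit of the kind referenced in item (1) might examine, the sketch below computes selection rates by demographic category and the ratio of each rate to the highest rate, an impact-ratio-style comparison commonly used in adverse-impact analysis. The category labels, data, and metric are illustrative assumptions, not a methodology prescribed by the law's text.

```python
# Illustrative only: hypothetical data and an impact-ratio-style comparison;
# not the audit methodology prescribed by the law itself.
from collections import defaultdict

# (category, selected) pairs for candidates evaluated by an AEDT -- invented data.
outcomes = [("group_a", True), ("group_a", False), ("group_a", True),
            ("group_b", True), ("group_b", False), ("group_b", False)]

counts = defaultdict(lambda: {"selected": 0, "total": 0})
for category, selected in outcomes:
    counts[category]["total"] += 1
    counts[category]["selected"] += int(selected)

# Selection rate per category, then each rate divided by the highest rate.
rates = {c: v["selected"] / v["total"] for c, v in counts.items()}
top_rate = max(rates.values())
impact_ratios = {c: rate / top_rate for c, rate in rates.items()}

print(rates)          # {'group_a': 0.667, 'group_b': 0.333} (approximately)
print(impact_ratios)  # {'group_a': 1.0, 'group_b': 0.5}
```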