
Artificial Intelligence in Employment: Second Hearing on NYC Automated Employment Decision Tools Proposed Rules and Upcoming EEOC Hearing

Client Alert | 3 min read | 01.24.23

On January 23, 2023, the New York City Department of Consumer and Worker Protection (DCWP) held a public hearing on its revised proposed rule on Local Law 144, which requires employers and employment agencies to procure and publish the results of an independent bias audit prior to using automated employment decision tools (AEDT).  Multiple stakeholders provided comments and testimony in response to the updated rule, many of them focusing in particular on the rule’s narrowed definition of AEDT, the scope of the required bias audit, and the notice requirements of the proposed rule.

Proposed Definition of AEDT

Local Law 144 currently defines AEDT as computational processes “derived from machine learning, statistical modelling, data analytics, or artificial intelligence” that issue a “simplified output,” e.g., a score, classification, or recommendation, which “is used to substantially assist or replace discretionary decision making.” The proposed regulations clarify that an AEDT “substantially assist[s]” in “discretionary decision making” when an employer relies solely upon the AEDT’s output, weighs that output more heavily than other criteria, or uses the simplified output to overrule conclusions derived from other factors, including human decision-making. Notably, however, the revised version of the proposed rules removed from this definition situations in which a simplified output is used not to overrule, but merely to modify, human decision-making. Many stakeholders objected to this change, arguing that the narrowed definition is overly restrictive and would severely limit the effectiveness of the law. These stakeholders also noted that the definition does not align with existing definitions of AEDTs, such as that used in the Blueprint for an AI Bill of Rights issued by the White House. Additionally, some stakeholders objected to the definition of “machine learning, statistical modelling, data analytics, or artificial intelligence,” noting that its cross-validation component is problematic: because cross-validation is not a necessary step, AEDT providers could simply release models without cross-validation that would be exempt from the law.

Bias Audit Requirement

A number of participants also raised concerns about the scope of the bias audit requirement. Proposals included expanding the audit beyond assessing only the tool’s disparate impact, adding mechanisms to confirm audit validity, and requiring that bias audit results be made public.

Notice Requirement

Several stakeholders commented on the proposed requirement that an employer notify candidates and employees residing in New York City that an AEDT will be used. In particular, one participant noted that the 10-day notice period could disadvantage New York City candidates, who would be subject to the requirement even though the hiring process typically runs on a much faster timeline.

DCWP must now decide whether it will make any further amendments to its proposed rule. It has not yet announced when it will release the final rule or whether the enforcement date will be further extended. Absent further extensions, enforcement is set to begin on April 15, 2023.

Upcoming EEOC Hearing

The EEOC has also indicated that addressing discrimination arising from the use of automated employment decision tools is a priority. On January 31, 2023, at 10:00 a.m. ET, the EEOC will hold a livestreamed hearing titled “Navigating Employment Discrimination in AI and Automated Systems: A New Civil Rights Frontier.” Interested parties may register for the hearing in advance.

The hearing follows the EEOC’s Strategic Enforcement Plan (SEP) for 2023 to 2027, published in the Federal Register on January 10. In the SEP, the EEOC outlines a number of its “subject matter priorities” for the upcoming years, including the goal of eliminating barriers in recruiting and hiring. The SEP lists a number of hiring and recruitment practices and policies that the agency believes may be discriminatory, including: (1) “the use of automated systems, including artificial intelligence or machine learning, to target job advertisements, recruit applicants, or make or assist in hiring decisions where such systems intentionally exclude or adversely impact protected groups”; (2) “restrictive application processes or systems, including online systems that are difficult for individuals with disabilities or other protected groups to access”; and (3) “screening tools or requirements that disproportionately impact workers based on their protected status, including those facilitated by artificial intelligence or other automated systems, pre-employment tests, and background checks.”
