HHS Applies Discrimination Prohibitions to Use of Automated and Non-Automated Patient Care Decision Support Tools
Client Alert | 7 min read | 05.08.24
On Monday, May 6th, the Office for Civil Rights (“OCR”) at the U.S. Department of Health and Human Services (“HHS”) published a final rule to implement Section 1557 of the Affordable Care Act[1] (“Section 1557”), which prohibits discrimination on the basis of race, color, national origin, age, disability, or sex (including pregnancy, sexual orientation, gender identity, and sex characteristics) in covered health programs or activities (the “Final Rule”). Here, we focus on the Final Rule’s application of nondiscrimination principles under Section 1557 to the use of “patient care decision support tools” in clinical care. The Final Rule responds to the President’s Executive Order on Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence and to comments received on the proposed rule for Section 1557. Other key provisions of the Final Rule, including the restoration of protections for LGBTQI+ and pregnant individuals, are summarized in a companion alert.
Background
The Final Rule builds on the Biden-Harris administration’s focus on health equity by strengthening protections against discrimination in health care and health technology and by clarifying the broad scope of those protections. Among other key provisions, the Final Rule requires that covered entities meet certain standards when using “patient care decision support tools,” which include, but are not limited to, artificial intelligence (“AI”), clinical algorithms, and non-automated decision-making tools.
Key Provisions
The Final Rule clarifies that any health program or activity that receives federal financial assistance, even in part, along with any program or activity that is administered by an executive agency and any entity established under Title I of the Affordable Care Act (“ACA”) or its amendments must abide by Section 1557’s nondiscrimination requirements. Importantly, the Final Rule expressly provides notice that Medicare Part B meets the definition of federal financial assistance for Section 1557’s purposes. Additionally, the Final Rule explicitly includes health insurance plans as covered entities under Section 1557, and requires that all covered entities meet certain standards such as proactively notifying individuals that accessibility services are available to patients at no cost. Furthermore, OCR applies accessibility standards in Section 1557 to telehealth services, and protects LGBTQI+ patients from discrimination based on sex. For a complete summary of such key provisions, please visit our companion alert.
Using Patient Care Decision Support Tools
In the Final Rule, OCR addresses how the increasing use of AI and other tools in health programs and activities could lead to discrimination, and it applies the nondiscrimination principles under Section 1557 to the use of “patient care decision support tools” in clinical care. Accordingly, the Final Rule requires covered entities (e.g., hospitals, health care providers, health insurance issuers) to take steps to identify and mitigate discrimination when they use AI and other forms of decision support tools in patient care.
In Section 92.210 of the Final Rule, OCR expressly prohibits a covered entity from discriminating on the basis of race, color, national origin, sex, age, or disability in its health programs or activities through the use of “patient care decision support tools,” defined below. This section specifically obligates the covered entity to:
- Make reasonable efforts to identify uses of “patient care decision support tools” in its health programs or activities that employ input variables or factors that measure race, color, national origin, sex, age, or disability on an ongoing basis; and
- For each patient care decision support tool identified, make reasonable efforts to mitigate the risk of discrimination resulting from the tool’s use in its health programs or activities.
The Final Rule creates a new term, “patient care decision support tool,” defined as “any automated or non-automated tool, mechanism, method, technology, or combination thereof used by a covered entity to support clinical decision-making in its health programs or activities.” According to OCR, this term makes clear that Section 92.210 of the Final Rule applies to the use of tools by covered entities that support clinical decision-making affecting patient care. In direct response to comments on the proposed rule for Section 1557, the definition both adds specificity and clarifies the breadth of tools covered by Section 92.210, which includes not only automated tools, such as software, but also non-automated tools, such as flowcharts used for triage. In practice, such tools may be used for a wide variety of purposes that impact patient care, such as assessing patient health status, determining eligibility for certain care, analyzing medical necessity, and making recommendations related to care and disease management.
Importantly, OCR limits Section 92.210 to tools related to clinical decision-making that affect patient care and are within a covered entity’s health programs or activities. The Final Rule does not apply to the use of tools for administrative or billing-related activities, patient scheduling, or facilities management, for example. OCR also notes that the purpose of Section 92.210 is to ensure that “patient care decision support tools” are not used in a discriminatory manner, not to prevent their use altogether.
Specific Obligations for Covered Entities Using Patient Care Decision Support Tools
In response to OCR’s proposed rule for Section 1557, commenters requested clarification on the steps that covered entities should take to comply with Section 92.210 and ensure that their use of “patient care decision support tools” does not result in discrimination, since covered entities generally are not tool developers and typically rely on the tools for the developers’ intended uses. In addition to the “reasonable efforts” requirements described above, OCR acknowledges that covered entities may not be aware of the datasets used by developers to train “patient care decision support tools” and does not require that covered entities obtain such datasets as part of their obligations. However, OCR notes that if a covered entity has reason to believe that a “patient care decision support tool” uses variables that measure race, color, national origin, sex, age, or disability, or otherwise knows or should know that the tool could result in discrimination, the covered entity should consult publicly available sources or the developer. This obligation works in coordination with the Office of the National Coordinator for Health Information Technology’s recently published final rule requiring certain developers to disclose specific information about decision support interventions.[2] OCR declines to require covered entities to take specific risk mitigation efforts under Section 92.210(c) of the Final Rule, noting that the “reasonable efforts” standard appropriately balances the need for covered entities to protect against discrimination with the burden of doing so, while still allowing covered entities to implement more robust safeguards if they choose.
In terms of enforcement, OCR will assess each alleged violation of Section 92.210 on a case-by-case basis and may consider factors such as the covered entity’s size and resources, whether the covered entity used the tool in a manner intended by the developer and approved by regulators, whether the covered entity received product information from the tool’s developer regarding variables that may lead to discrimination, and whether the covered entity has a process in place for evaluating “patient care decision support tools.” Of note, OCR currently seeks comments on whether it should expand the scope of Section 92.210 to tools that do not directly impact patient care and clinical decision-making but may still result in discrimination in violation of Section 1557.
Takeaways
Under Section 92.210, covered entities must take steps to identify the “patient care decision support tools” (including both automated and non-automated tools that impact patient care) used in their health programs and activities that may result in discrimination on the basis of race, color, national origin, sex, age, or disability. Upon such identification, covered entities must make reasonable efforts to mitigate the risk of discrimination resulting from each tool’s use in their health programs or activities. To comply with these new identification and mitigation requirements under Section 92.210, covered entities must, within 300 days of the Final Rule’s effective date (July 5, 2024), establish and implement policies and procedures to assess their use of “patient care decision support tools” and their potentially discriminatory impact on patient care. Given the widespread and increasing use of “patient care decision support tools” in health care, the Final Rule establishes meaningful and significant responsibilities for covered entities and the health technology companies developing such tools.
Our team at Crowell & Moring is ready to assist as you work to understand your obligations under the Final Rule and how to implement safeguards against discrimination resulting from the use of “patient care decision support tools.” For more information about the Final Rule, please contact the professionals listed below, or your regular Crowell contact.
[1] 42 U.S.C. § 18116.
[2] 45 CFR 170.102; U.S. Dep't of Health & Hum. Servs., Off. of the Nat'l Coordinator for Health Info. Tech., Health Data, Technology, and Interoperability: Certification Program Updates, Algorithm Transparency, and Information Sharing, Final Rule, 89 FR 1192 (January 9, 2024).