Navigate Obamacare nondiscrimination regulations for healthcare AI customers with ISO 42001
What AI health-tech companies need to know about new rules for "patient care decision support tools."
Last week an AI-powered health-tech startup founder flagged this huge compliance issue:
“The chief data scientist of a very big health system…wasn’t even aware of the [Section] 1557 [of the Affordable Care Act (ACA) requirements] for care decision support tools.”
So I wrote this post to explain:
what it means for health-tech firms selling AI, and
how ISO 42001 can help them drive revenue by easing customer compliance concerns.
What is Section 1557 of the ACA (also known as Obamacare) and the relevant regulation?
Fourteen years after Obamacare passed, regulations based on it are still just coming into force.
For non-nerds, regulations turn laws passed by Congress into detailed (and enforceable) guidance.
One of them, called "Nondiscrimination in Health Programs and Activities" and implementing Section 1557 of the ACA, was finalized in May of 2024.
Why should AI health-tech companies care about it?
Section 92.210 of the regulation requires health:
insurance issuers
providers
programs
that receive federal funding (a huge segment of health-tech customers) to meet these requirements when using "patient care decision support tools" (PCDSTs):
not discriminate based on protected characteristics
track PCDSTs that "employ [protected characteristics as] input variables"
mitigate the risk of discrimination from these PCDSTs.
What are PCDSTs?
The regulation defines them as any:
(non-)automated tool, mechanism, method, or tech
a covered entity uses for clinical decision-making
in its health programs or activities
A previous draft used the term "clinical algorithm," but this is the final (paraphrased) definition.
And it covers a wide variety of tools (including most AI, by my reading).
So why should AI healthcare firms care about this, again?
Your customers' compliance programs are about to go into overdrive. This requirement goes live in March 2025, and few appear aware of it!
If you can't give them the answers they need, expect:
Lots of questionnaires and calls
Stalled deployments
Slower sales cycles
There is good news, though.
ISO 42001 certification can help here.
Which ISO 42001 requirements show covered entities (your customers) that they can comply with Section 1557?
External review of AI risk, impact, and treatment plans
Evaluation of fairness, transparency, and safety
Tracking of data provenance, quality, and bias
AI inventory through lifecycle management
Internal auditing of the above
Section 1557 can have a huge impact on health-tech companies selling AI
ISO 42001 is a comprehensive yet flexible way to build your AI Management System (AIMS) and have it externally audited. StackAware helps AI-powered health-tech companies like Eleos get ISO 42001-ready so they can close customers faster by building trust and managing risk.