Colorado leads nation on AI healthcare regulations

This article appears in the Winter 2025 issue of ColoradoBiz under the headline, "Colorado leads in regulating AI use in healthcare."

From revolutionizing diagnostics and drug discovery to streamlining administrative tasks, artificial intelligence (AI) is bringing advancements to the medical field.

But as AI systems become integrated into the nation’s healthcare infrastructure, questions are arising regarding data privacy, algorithmic bias and the impact on the human element of care.

Colorado is at the forefront of the debate. It’s the first state to enact regulations governing the use of the technology in healthcare systems. Colorado’s Artificial Intelligence Act is set to take effect on June 30, 2026, and will impose governance and disclosure requirements on healthcare providers deploying AI systems.

According to an article co-authored by Charles Gass, senior counsel at Foley & Lardner in Denver, the act “primarily seeks to mitigate algorithmic discrimination, defined as AI-driven decision-making that results in unlawful differential treatment or disparate impact on individuals based on certain characteristics, such as race, disability, age or language proficiency. The Act seeks to prevent AI from reinforcing existing biases or making decisions that unfairly disadvantage particular groups.”

If it goes into effect, the act will be enforced by the Colorado attorney general.

While various bills have been proposed at the federal level, Congress has not approved legislation addressing AI in healthcare.
The most recent development is a Republican-backed bill that would bar states from regulating AI models for the next decade, with limited exceptions. If it passes, it would supplant Colorado's law.

“AI is going to be used, in some cases without patients’ permission,” Gass said. “Patients should ask whether their provider is using AI to assist administratively or clinically.”

One of Gass’s clients reported feeling uncomfortable when a healthcare provider said AI would be used to take notes, and the client asked that it not be used.

“The reaction from the provider was almost disgruntled,” Gass said.

Although AI holds promise for transforming healthcare by improving diagnostics and personalizing treatment plans, the data it learns from can introduce bias into its outputs.

AI systems trained on data drawn mostly from certain demographics can perform poorly for underrepresented groups such as women, racial and ethnic minorities, or elderly patients.

For example, AI models trained predominantly on lighter skin tones have shown less accuracy in detecting skin cancer in people with darker skin, and algorithms for diagnosing cardiac conditions may be less accurate for women if trained mainly on male data.

“Certain clinical conditions manifest differently in different populations,” Gass said. “If it’s only trained on a certain population, that’s where we really need to rely on our providers to still be in the driver’s seat for patient care. They need to be in tune with and aware of patients’ needs and not rely on it too heavily.”

Practices that use AI for scheduling must also be mindful of language barriers. If a patient’s primary language is Spanish and the AI tool can’t understand them, so an appointment never gets scheduled, that could be considered discrimination, Gass said.
“It’s probably only going to get harder to get to a human, so there are concerns around how those tools are trained,” he said.

Elevating satisfaction, battling burnout

Despite the legal complexities that come with using AI in healthcare, it’s a useful tool for providers.
Denver Health, for example, uses Nabla’s ambient AI assistant to summarize conversations between healthcare providers and their patients.

“Nabla has really high levels of security and privacy guardrails,” said Dr. Daniel Kortsch, associate chief medical information officer at Denver Health. “The transcription is heavily encrypted and available only to the medical provider. There are no stored audio files or transcriptions.”

Patients report higher satisfaction with their visits because the doctor interacts with them more instead of typing information into a computer.

But if a patient asks the provider not to use AI, their wishes will be respected, Kortsch said.

Another benefit of AI is that it relieves burnout among Denver Health employees, which has improved retention.

“Providers are more willing to continue their clinical hours and work at Denver Health,” Kortsch said. “Staff tell me they were planning to quit but didn’t because of this product. I had a doctor give me a box of chocolates when we rolled it out. So many people take work home and sit in their pajamas typing their notes.”

Kortsch also notes that Epic Systems, maker of the world’s largest electronic health record, has created Epic Cosmos, which uses its datasets to improve patient care. Cosmos can predict how some medications affect blood pressure or diabetes care, he said.
“Ambient AI is the most transformational product I’ve seen since the advent of the electronic health record,” Kortsch said. “These types of things need to be embraced.”

Transforming rural healthcare

Artificial intelligence can also be useful in rural areas where people have difficulty accessing healthcare.

Colorado State University is part of a multi-institution project that’s developing mobile clinics equipped with AI systems to bridge gaps in rural healthcare.

The $25 million project, led by the University of Michigan, is being funded through the Advanced Research Projects Agency for Health, which supports the development of transformative biomedical and health discoveries.

The goal is to develop AI systems that can make diagnoses, run and interpret tests and perform procedures from a mobile setup parked in remote areas.

The system may be able to provide instructions for procedures, such as ultrasounds, that some providers are unfamiliar with.

“A general practitioner at a local clinic may not have the knowledge of some specialized machines,” said Nikhil Krishnaswamy, an assistant professor of computer science at CSU. “An AI system can walk them through specialized procedures to get images that can be sent off to a radiologist.”

AI systems are not designed to replace human judgment but rather to assist.

“You don’t want AI to make diagnoses without human input,” Krishnaswamy said. “You need to have the human in the loop to make that final decision.”
