
Community Hospital ‘listening’ AI; doctor discusses risks

ANDERSON — A new artificial intelligence will be used throughout Community Health Network, including at Community Hospital Anderson.

Community Health Network will be introducing DAX Express within the coming weeks. DAX is an AI that will document patients’ medical conditions and complaints during doctor’s office visits.

Dr. Patrick McGill, executive vice president for Community Health Network, described DAX as an “ambient listening” form of AI.

“It listens during the office visit, (it listens) to the conversation between the patient and the clinician and generates the progress note (patient chart) that the clinician will then review,” he explained.

“It’s really designed to get the clinician off of the keyboard during the visit and have more of a person-to-person interaction.”

Despite the potential benefits, the use of AI in the healthcare field raises concerns about security and bias.

Data used by DAX would likely include medical records, which could contain confidential information. McGill said that unlike popular AI platforms such as ChatGPT, DAX is not connected to the internet, meaning the recorded data cannot be accessed online.

“It’s a secure portal for patient information,” McGill said.

McGill clarified that DAX will be used with the patient’s consent.

Regarding bias, McGill said that since DAX merely listens and records, it is unlikely to yield biased results.

Limiting and eliminating bias comes down to monitoring the AI and the data used by it.

“(You) start with awareness of bias in the models and then how do you build the models to be able to address it and overcome it.”

AI bias could pose a danger to minorities, especially when their health is involved.

A 2021 article published in the data science journal Patterns reported that an AI algorithm used cost data to predict how much care a patient would need; less cost meant less care.

The cost was lower for Black patients, leading them to receive less care for diseases including diabetes and kidney disease.
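The mechanism the Patterns article describes can be illustrated with a minimal sketch. This is a hypothetical example with invented numbers, not the actual algorithm from the study; it shows how using past healthcare spending as a proxy for medical need penalizes patients whose recorded costs are lower because of access barriers, not because they are healthier.

```python
def predict_care_level(past_cost, threshold=5000):
    """Flag a patient for extra care only if their historical
    healthcare spending exceeds a threshold (cost as a proxy for need)."""
    return "extra care" if past_cost > threshold else "standard care"

# Two patients with the same medical need, but one group's recorded
# spending is lower due to barriers to accessing care.
patient_a_cost = 8000   # full access to care -> higher recorded spending
patient_b_cost = 3000   # access barriers -> lower recorded spending

print(predict_care_level(patient_a_cost))  # extra care
print(predict_care_level(patient_b_cost))  # standard care, despite equal need
```

Because the model never sees need directly, only spending, it systematically under-allocates care to the group whose costs were suppressed by access barriers.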

Such data doesn’t account for Black Americans’ limited access to care, driven by high costs and distrust of the healthcare system, according to Veda Morris May, executive director of the Minority Health Coalition.

“If they don’t feel comfortable going to doctors, of course, they’re not going to go,” she said.

May said the Minority Health Coalition of Madison County works to build that trust.

The U.S. Department of Health and Human Services’ Office of Minority Health suggests possible remedies for AI bias, which include:

  • Having a diverse body of people review and supervise the AI.
  • Working with diverse communities to ensure the algorithms don’t cause harm.
  • Introducing the algorithms (the sets of instructions an AI follows) gradually and carefully.

May recommended those monitoring the AI know the community they serve.

“Are they serving them at the rate that they should be and the quality that they should be? If you’re going to use those numbers, you have to be sure that your data is good.”
